1000 results for "Software de gestión" (management software)
Abstract:
At present, the management of research projects at the Universidad Complutense de Madrid is only partially automated or supported by software applications. The most common justification processes in the financial management of research projects (payment/purchase management, travel management and justification, and management of inventoriable assets) are still carried out manually, on paper. GesInv was created to simplify and speed up these processes, which impose a heavy workload on researchers and administrative staff because of the large number of procedures handled at the UCM. GesInv provides a web portal that integrates all the actors involved in these procedures, enabling electronic processing that shortens turnaround times and reduces the errors made when transferring information from paper to digital form, as is done today.
Abstract:
This research aims to highlight the need to properly manage the cash generated by a company's operating cycle. This cash is a guarantee of the company's continuity, because it provides liquidity and makes it possible to obtain financing by optimizing the elements of the business, especially in times of crisis: stagnating or falling turnover, public spending cuts that affect the value chain, restricted access to external credit, and partners and shareholders who seek a return on their investments without having to make new contributions. Synthesis: managing the cash generated by the operating cycle properly requires a model for optimizing the Operating Funds Requirements (Necesidades Operativas de Fondos, NOF). In the absence of such models, this research sets out to build and test one, as an automated tool that simulates and measures the impact of the investment in NOF on a company's solvency, profitability and value, in order to correct a problem of unavailable or insufficient liquidity. Excel was chosen as the supporting software for the model, and its construction followed these steps:
1. Automating the conversion of accounting financial statements into functional financial statements, using data-entry templates that make it possible to identify and quantify the NOF.
2. Designing simulation templates with alternatives that generate impacts on the financial statements. These alternatives include both elements without an explicit cost and elements that carry such a cost.
3. Automatically issuing reports that compare the company's position before and after the simulated impacts on the financial statements, in order to analyze its situation in terms of financial equilibrium, solvency, profitability and value generated, including ratios that relate different magnitudes and reveal operating management and efficiency.
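As a rough illustration of the arithmetic such a model automates, consider the Python sketch below. It uses the standard NOF definition (operating current assets minus spontaneous, cost-free financing); the figures and function names are invented for illustration and are not taken from the study or its Excel tool.

    # Illustrative sketch of the NOF calculation the model automates.
    # All figures are invented; this is not the Excel tool described above.

    def nof(receivables, inventories, operating_cash, payables):
        """NOF = operating current assets - spontaneous (cost-free) financing."""
        return receivables + inventories + operating_cash - payables

    def liquidity_gap(nof_value, working_capital):
        """A positive gap means NOF exceeds long-term funding, so costly
        short-term credit is needed; a negative gap means a cash surplus."""
        return nof_value - working_capital

    before = nof(receivables=400.0, inventories=250.0, operating_cash=50.0, payables=300.0)
    # Simulated impact of a cost-free alternative: tighter collection terms
    # cut receivables by 80.
    after = nof(receivables=320.0, inventories=250.0, operating_cash=50.0, payables=300.0)
    print(liquidity_gap(before, working_capital=350.0))  # 50.0 -> credit needed
    print(liquidity_gap(after, working_capital=350.0))   # -30.0 -> surplus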
Abstract:
In the civilization of ancient Rome, three of the most important aspects of daily life were tied to architecture: the baths, the aqueducts and the tombs. This research proposes a study of the integration of advanced systems for the documentation, management and valorization of the funerary architectural heritage of the Via Latina and the Appia Antica, in close connection with the theme of the Cultural Landscape. Indeed, the Park of the Latin Tombs houses one of the most important funerary complexes, one that still preserves the traditional appearance of the ancient Roman landscape. Running along a paved road, like all consular roads, the Via Latina (like the Via Appia Antica), which, as Livy recalled, once connected Rome with Capua, still keeps the ancient urban and landscape layout "frozen". The multiform system of the Ager Romanus and of the Via Latina/Appia Antica cultural site studied in this research is therefore comparable to a living, dynamic structure, and it must be analyzed as such. Consequently, to design a protection and management tool "powerful" enough for a historic site of such enormous importance, it was necessary to use the most advanced architectural survey techniques currently available (such as laser scanning and photogrammetry, together with dedicated analysis software), accompanied by an in-depth study of ancient construction techniques. The last key aspect the research addresses is cataloguing. Historic sites and monuments cannot be maintained through passive use alone; every protection and conservation operation must be activated, through direct interventions (maintenance/restoration) and indirect ones, such as the constant cataloguing of historic works and the resulting "dynamic cataloguing".
Abstract:
This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
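The comparison against the gold standard can be reproduced in outline with SciPy (version 1.11 or later for the Dunnett test); the data below are simulated for illustration, not the study's measurements.

    # Outline of the statistical comparison described above: one-way ANOVA plus
    # Dunnett's post-hoc test against the gold standard. Data are simulated.
    import numpy as np
    from scipy import stats  # stats.dunnett requires SciPy >= 1.11

    rng = np.random.default_rng(0)
    gold = rng.normal(10.0, 0.5, 20)             # physical (gold standard) measurements
    soft_a = gold + rng.normal(-0.11, 0.2, 20)   # software packages with small biases
    soft_b = gold + rng.normal(-0.14, 0.2, 20)
    soft_c = gold + rng.normal(0.25, 0.2, 20)

    f_stat, p_anova = stats.f_oneway(gold, soft_a, soft_b, soft_c)
    res = stats.dunnett(soft_a, soft_b, soft_c, control=gold)
    print(p_anova, res.pvalue)  # p > 0.05 would mirror the reported finding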
Abstract:
This paper presents SMarty, a variability management approach for UML-based software product lines (PL). SMarty is supported by a UML profile, the SMartyProfile, and a process for managing variabilities, the SMartyProcess. SMartyProfile aims at representing variabilities, variation points, and variants in UML models by applying a set of stereotypes. SMartyProcess consists of a set of activities that are systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using SEI's Arcade Game Maker PL. An evaluation of SMarty and a discussion of related work are also included.
Abstract:
Thousands of Free and Open Source Software Projects (FSP) were, and continue to be, created on the Internet. This scenario increases the number of opportunities to collaborate to the same extent that it promotes competition for users and contributors, who can take projects to levels unachievable by founders alone. Thus, given that the main goal of FSP founders is to improve their projects by means of collaboration, it becomes important to understand and manage a project's capacity to attract users and contributors. To support researchers and founders in this challenge, this paper introduces the concept of attractiveness and develops a theoretical-managerial toolkit covering the causes, indicators and consequences of attractiveness, enabling its strategic management.
Abstract:
Objective To evaluate drug interaction software programs and determine their accuracy in identifying drug-drug interactions that may occur in intensive care units. Setting The study was developed in Brazil. Method Drug interaction software programs were identified through a bibliographic search in PUBMED and in LILACS (a database of health sciences literature published in Latin American and Caribbean countries). The programs' sensitivity, specificity, and positive and negative predictive values were determined to assess their accuracy in detecting drug-drug interactions. The accuracy of the software programs identified was determined using 100 clinically important interactions and 100 clinically unimportant ones. Stockley's Drug Interactions, 8th edition, was employed as the gold standard for the identification of drug-drug interactions. Main outcome Sensitivity, specificity, positive and negative predictive values. Results The programs studied were: Drug Interaction Checker (DIC), Drug-Reax (DR), and Lexi-Interact (LI). DR displayed the highest sensitivity (0.88) and DIC the lowest (0.69). A close similarity was observed among the programs regarding specificity (0.88-0.92) and positive predictive values (0.88-0.89). DIC had the lowest negative predictive value (0.75) and DR the highest (0.91). Conclusion The DR and LI programs displayed appropriate sensitivity and specificity for identifying drug-drug interactions of interest in intensive care units. Drug interaction software programs help pharmacists and health care teams prevent and recognize drug-drug interactions, and they optimize the safety and quality of care delivered in intensive care units.
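The four accuracy measures follow directly from a 2x2 confusion table built over the 200 test interactions; the short Python sketch below shows the arithmetic, with invented counts rather than the study's data.

    # How the reported metrics derive from a 2x2 confusion table
    # (100 clinically important + 100 unimportant interactions).
    # The counts below are invented, not the study's data.

    def accuracy_metrics(tp, fn, tn, fp):
        return {
            "sensitivity": tp / (tp + fn),  # share of important interactions flagged
            "specificity": tn / (tn + fp),  # share of unimportant ones not flagged
            "ppv": tp / (tp + fp),          # positive predictive value
            "npv": tn / (tn + fn),          # negative predictive value
        }

    # A program flagging 88/100 important and rejecting 90/100 unimportant:
    print(accuracy_metrics(tp=88, fn=12, tn=90, fp=10))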
Abstract:
Support for the interoperability and interchangeability of software components that are part of a fieldbus automation system relies on the definition of open architectures, most of them involving proprietary technologies. Concurrently, standard, open and non-proprietary technologies such as XML, SOAP and Web Services have greatly evolved and been widely adopted in the computing area. This article presents a FOUNDATION fieldbus™ device description technology named Open-EDD, based on XML and related technologies (XSLT, DOM using the Xerces implementation, OO, XML Schema), proposing an open and non-proprietary alternative to the EDD (Electronic Device Description). This initial proposal includes defining Open-EDDML as the programming language of the technology in the FOUNDATION fieldbus™ protocol, implementing a compiler and a parser, and finally integrating and testing the new technology using field devices and a commercial fieldbus configurator. This study attests that the new technology is feasible and can be applied to other configurators or HMI applications used in fieldbus automation systems.
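A hypothetical sketch of what consuming such an XML-based device description with the DOM might look like in Python; the element and attribute names below are invented for illustration, since the abstract does not reproduce the Open-EDDML schema.

    # Hypothetical Open-EDDML-style fragment parsed with the DOM; element and
    # attribute names are invented, not taken from the paper.
    from xml.dom.minidom import parseString

    device_description = """
    <device manufacturer="0x1234" deviceType="0x01" ddRevision="2">
      <block name="AI" kind="function">
        <parameter name="OUT" type="float" unit="degC"/>
        <parameter name="HI_LIM" type="float" unit="degC"/>
      </block>
    </device>
    """

    dom = parseString(device_description)
    for block in dom.getElementsByTagName("block"):
        params = [p.getAttribute("name") for p in block.getElementsByTagName("parameter")]
        print(block.getAttribute("name"), params)  # -> AI ['OUT', 'HI_LIM']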
Abstract:
This paper presents a proposal for a software development reference model aimed at small companies. Despite the importance of small software companies in Latin America, the lack of standards of their own, able to meet their specific needs, has created serious difficulties both in improving their processes and in certifying their quality. As a contribution to a better understanding of the subject, a reference model is therefore proposed and, as a means to validate the proposal, a report of its application in a small Brazilian company committed to certification under the MPS.BR quality model is presented.
Abstract:
This report describes recent updates to the custom-built data-acquisition hardware operated by the Center for Hypersonics. In 2006, an ISA-to-USB bridging card was developed as part of Luke Hillyard's final-year thesis. This card allows the hardware to be connected to any recent personal computer via a (USB or RS232) serial port, and it provides a number of simple text-based commands for control of the hardware. A graphical user interface program was also updated to help the experimenter manage the data-acquisition functions. Sampled data are stored in text files that have been compressed in the gzip format. To simplify the later archiving or transport of the data, all files specific to a shot are stored in a single directory. This includes a text file for the run description, the signal configuration file and the individual sampled-data files, one for each signal that was recorded.
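Reading such a shot directory back is straightforward; the sketch below assumes file names of this general kind, which are not specified in the report.

    # Sketch of reading back a shot directory as described above; the directory
    # and file names are assumptions, not the Center's actual layout.
    import gzip
    from pathlib import Path

    shot_dir = Path("shot_7319")  # one directory per shot (hypothetical name)
    print((shot_dir / "run_description.txt").read_text())

    for signal_file in sorted(shot_dir.glob("*.gz")):  # one file per recorded signal
        with gzip.open(signal_file, "rt") as f:
            samples = [float(line.split()[-1]) for line in f if not line.startswith("#")]
        print(signal_file.name, len(samples), "samples")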
Abstract:
The XSophe-Sophe-XeprView® computer simulation software suite enables scientists to easily determine spin Hamiltonian parameters from isotropic, randomly oriented and single-crystal continuous wave electron paramagnetic resonance (CW EPR) spectra of radicals and isolated paramagnetic metal ion centers or clusters found in metalloproteins, chemical systems and materials science. XSophe provides an X-windows graphical user interface to the Sophe programme and allows: creation of multiple input files, local and remote execution of Sophe, and the display of the sophelog (output from Sophe) and input parameters/files. Sophe is a sophisticated computer simulation software programme employing a number of innovative technologies, including the Sydney OPera HousE (SOPHE) partition and interpolation schemes, a field segmentation algorithm, the mosaic misorientation linewidth model, parallelization and spectral optimisation. In conjunction with the SOPHE partition scheme and the field segmentation algorithm, the SOPHE interpolation scheme and the mosaic misorientation linewidth model greatly increase the speed of simulations for most spin systems. Employing brute-force matrix diagonalization in the simulation of an EPR spectrum from a high-spin Cr(III) complex with the spin Hamiltonian parameters g_e = 2.00, D = 0.10 cm⁻¹, E/D = 0.25, A_x = A_y = 120.0 and A_z = 240.0 (× 10⁻⁴ cm⁻¹) requires a SOPHE grid size of N = 400 (to produce a good signal-to-noise ratio) and takes 229.47 s. In contrast, the use of either the SOPHE interpolation scheme or the mosaic misorientation linewidth model requires a SOPHE grid size of only N = 18 and takes 44.08 and 0.79 s, respectively. Results from Sophe are transferred via the Common Object Request Broker Architecture (CORBA) to XSophe and subsequently to XeprView®, where the simulated CW EPR spectra (1D and 2D) can be compared to the experimental spectra. Energy level diagrams, transition roadmaps and transition surfaces aid the interpretation of complicated randomly oriented CW EPR spectra and can be viewed with a web browser and an OpenInventor scene graph viewer.
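As a point of reference for the brute-force route mentioned above, here is a minimal numpy sketch of matrix diagonalization for the quoted S = 3/2 zero-field terms, at zero field and without hyperfine structure; it is an illustration only, not Sophe's implementation.

    # Brute-force diagonalization of an S = 3/2 zero-field-splitting Hamiltonian
    # with the D and E/D values quoted above (zero field, hyperfine omitted).
    import numpy as np

    S = 1.5
    m = np.arange(S, -S - 1, -1)  # m_S = 3/2, 1/2, -1/2, -3/2
    Sz = np.diag(m)
    Sp = np.diag(np.sqrt(S*(S + 1) - m[1:]*(m[1:] + 1)), k=1)  # raising operator S+
    Sx, Sy = (Sp + Sp.T) / 2, (Sp - Sp.T) / 2j

    D, E = 0.10, 0.025  # cm^-1; E/D = 0.25
    H = D*(Sz @ Sz - S*(S + 1)/3*np.eye(4)) + E*(Sx @ Sx - Sy @ Sy)
    print(np.linalg.eigvalsh(H))  # energies of the two Kramers doublets (cm^-1)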
Abstract:
Using Landsat imagery, forest canopy density (FCD) estimated with the FCD Mapper® was correlated with predominant height (PDH, measured as the average height of the tallest 50 trees per hectare) for 20 field plots measured in native forest at Noosa Heads, south-east Queensland, Australia. A corresponding image was used to calculate FCD on Leyte Island, the Philippines, and was validated on the ground for accuracy. The FCD Mapper was produced for the International Tropical Timber Organisation and estimates FCD as an index of canopy density using the reflectance characteristics of Landsat Enhanced Thematic Mapper (ETM) images. The FCD Mapper is a ‘semi-expert' computer program which uses interactive screens to allow the operator to make decisions concerning the classification of land into bare soil, grass and forest. At Noosa, a strong positive nonlinear relationship (r² = 0.86) was found between FCD and PDH for 15 field plots with variable PDH but complete canopy closure. An additional five field plots were measured in forest with a broken canopy, and the software assessed these plots as having a much lower FCD than forest with canopy closure. FCD estimates for forest and agricultural land on the island of Leyte and subsequent field validation showed that, at appropriate settings, the FCD Mapper differentiated between tropical rainforest and banana or coconut plantation. These findings suggest that in forests with a closed canopy this remote sensing technique has promise for forest inventory and productivity assessment. The findings also suggest that the software has promise for discriminating between native forest with a complete canopy and forest with a broken canopy, such as coconut or banana plantation.
Abstract:
Expokit provides a set of routines aimed at computing matrix exponentials. More precisely, it computes either a small matrix exponential in full, the action of a large sparse matrix exponential on an operand vector, or the solution of a system of linear ODEs with constant inhomogeneity. The backbone of the sparse routines consists of matrix-free Krylov subspace projection methods (the Arnoldi and Lanczos processes), which is why the toolkit is capable of coping with sparse matrices of large dimension. The software handles real and complex matrices and provides specific routines for symmetric and Hermitian matrices. The computation of matrix exponentials is a numerical issue of critical importance in the area of Markov chains, where, furthermore, the computed solution is subject to probabilistic constraints. In addition to addressing general matrix exponentials, particular attention is paid to the computation of transient states of Markov chains.
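The sparse operation Expokit specializes in, w = exp(tA)v, is what makes Markov-chain transients tractable: for a generator Q, the state distribution evolves as p(t) = p(0) exp(tQ). Below is a sketch of the same computation for a toy chain using SciPy rather than Expokit itself.

    # The operation Expokit's sparse routines provide -- the action of a matrix
    # exponential on a vector -- here via SciPy, applied to the transient
    # distribution p(t) = p(0) exp(tQ) of a toy 3-state chain.
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.linalg import expm_multiply

    Q = csr_matrix(np.array([[-2.0,  1.0,  1.0],   # CTMC generator: rows sum to 0
                             [ 0.5, -1.0,  0.5],
                             [ 0.0,  2.0, -2.0]]))
    p0 = np.array([1.0, 0.0, 0.0])                 # start in state 0

    pt = expm_multiply(0.75 * Q.T, p0)             # p(t)^T = exp(t Q^T) p(0)^T
    print(pt, pt.sum())                            # probabilities, summing to 1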