887 results for Software Process Improvement


Relevance: 30.00%

Publisher:

Abstract:

Information systems and technologies are a vital element of successful organisations, and are therefore an essential area of study in business administration and management. Research on IT and business performance has found that the more successfully a company aligns information technology with its business objectives, the greater its profitability; yet only a quarter of companies achieve alignment between IT and the business (Luffman, 2003). This trend has gained considerable momentum among Colombian companies of all sizes and economic sectors. For this reason, we approached a Colombian company of international standing to describe the implementation of its enterprise resource planning (ERP) system, SAP ECC, and its integration within General Motors Colmotores, chiefly in order to analyse the company's use of SAP's main accounting, sales and purchasing modules and sub-modules. This project is particularly relevant to students and anyone interested in the use and exploitation of a company's information systems. It synthesises the experience during the preparation for the implementation, the implementation itself, and the main outcomes as perceived by the employees directly involved, gathered through scheduled visits and in-depth interviews; this provides a close view of the reality of General Motors Colmotores with SAP.

Abstract:

The aim of this work is to study the supply chain in business organisations from the perspective of System Dynamics, and to examine how System Dynamics can contribute to the performance and control of supply chains. It addresses three perspectives on the supply chain and their relationship with System Dynamics, identifies the types of integration in supply chain management activities and their planning horizons, and analyses Supply Chain Management applications that have been based on the System Dynamics methodology. The investigation begins by defining the problem of bringing these two areas together and setting out the theoretical framework that underpins both disciplines. It then covers the methodology used by System Dynamics and the different aspects of the supply chain, and examines how the two disciplines converge, with SD supporting SCM (Supply Chain Management); work carried out under the different approaches based on System Dynamics is also described. Finally, conclusions and comments are presented on this field of research and its relevance to the supply chain. The research spans two major schools of thought: a systemic one, through the System Dynamics methodology, and an analytical-logical one, as used in Supply Chain. A literature review was carried out on applications of System Dynamics (SD) in the Supply Chain area and their common ground, and important uses of the methodology in supply chain management were documented.
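The convergence of the two disciplines can be illustrated with a minimal stock-and-flow sketch, the modelling style System Dynamics brings to supply chains. All parameters here are hypothetical: a single-echelon inventory governed by an anchor-and-adjust ordering rule, with a first-order delivery delay, integrated week by week.

```python
# Minimal System Dynamics sketch of a supply chain echelon (hypothetical
# parameters): two stocks (inventory, supply pipeline) and an anchor-and-
# adjust ordering rule, stepped forward with weekly Euler integration.
def simulate(weeks=30, target=100.0, adjust_time=4.0, lead_time=2.0):
    inventory, pipeline = target, 0.0
    demand = [10.0] * 5 + [15.0] * (weeks - 5)   # step increase at week 5
    history = []
    for d in demand:
        # order enough to cover demand plus a fraction of the inventory gap
        orders = max(0.0, d + (target - inventory) / adjust_time)
        arrivals = pipeline / lead_time           # first-order delivery delay
        pipeline += orders - arrivals
        inventory += arrivals - d
        history.append(inventory)
    return history

trace = simulate()
```

Plotting `trace` shows the dip-and-recovery (and, with more echelons, amplification) behaviour that SD studies of the bullwhip effect describe.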

Abstract:

Geographic Information Systems (GIS) are a valid tool for the study of ancient landscapes. GIS can be configured as a set of analytical means for understanding the spatial dimension of social formations and their historical dynamics. In other words, GIS enable a valid approach to the rationality of a community's spatial behaviour and to the overall patterns of a society that are imprinted on the morphology of a landscape. Given the abundant and growing supply of software for processing and analysing spatial information, we focus on the advantages of adopting free and open-source solutions for the archaeological investigation of landscapes. As an example, we present cost-distance modelling applied to an archaeological locational problem: evaluating the siting of settlements with respect to the resources available in their surroundings. The experimental approach has been applied to the hillfort settlement of the La Cabrera district (León). A detailed description is given of how to create isochrone bands based on the calculation of the anisotropic costs inherent to pedestrian locomotion, as well as the advantages of adopting the GRASS GIS for implementing the analysis.
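The isochrone construction can be sketched in a few lines. This is a simplified illustration, not the GRASS implementation (which would use modules such as r.walk over real rasters): Tobler's hiking function supplies an anisotropic walking speed, and a Dijkstra accumulation over a toy elevation grid classifies cells into time bands. Cell size and band width are assumed values.

```python
import heapq
import math

def tobler_speed_kmh(slope):
    """Tobler's hiking function: walking speed peaks (~6 km/h) on a
    gentle -5% downhill; uphill and steep downhill are slower."""
    return 6.0 * math.exp(-3.5 * abs(slope + 0.05))

def cost_minutes(dist_m, dh_m):
    """Minutes to cross dist_m metres with elevation change dh_m
    (anisotropic: the sign of dh_m matters)."""
    m_per_min = tobler_speed_kmh(dh_m / dist_m) * 1000.0 / 60.0
    return dist_m / m_per_min

def isochrones(elev, start, cell_m=30.0, band_min=15.0):
    """Accumulated walking time from `start` over an elevation grid,
    classified into isochrone bands of band_min minutes (Dijkstra)."""
    rows, cols = len(elev), len(elev[0])
    best = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        t, (r, c) = heapq.heappop(pq)
        if t > best.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + cost_minutes(cell_m, elev[nr][nc] - elev[r][c])
                if nt < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nt
                    heapq.heappush(pq, (nt, (nr, nc)))
    return {cell: int(t // band_min) for cell, t in best.items()}
```

Because the edge cost depends on the direction of travel (uphill versus downhill), the accumulated surface is anisotropic, which is exactly why a plain Euclidean buffer around a site misrepresents pedestrian catchments.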

Abstract:

Virtual tools are commonly used nowadays to optimize the product design and manufacturing process of fibre-reinforced composite materials. The present work focuses on two areas of interest: forecasting part performance and the particularities of the production process. The first part proposes a multi-physical optimization tool to support the concept stage of a composite part. The strategy is based on the strategic handling of information and, through a single control parameter, is able to evaluate the effects of design variations throughout all these steps in parallel. The second part targets the resin infusion process and the impact of thermal effects. The numerical and experimental approach allowed the identification of improvement opportunities regarding the implementation of algorithms in commercially available simulation software.

Abstract:

The main objective of this work is to obtain the energy certificate of Gorka Goikoetxea's dwelling and, on that basis, to propose a set of improvements together with an economic analysis. The CE3X program is used for this purpose: once the data are entered, it returns the numerical rating and its letter grade, and then offers various improvement options. An economic analysis is carried out to assess them. This report explains the steps required to obtain the certificate, together with a description of the improvements and the new ratings. Additional information is provided in the annexes to ease understanding of the process.
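The economic analysis of retrofit options typically rests on simple payback and discounted-cash-flow figures; a minimal sketch with hypothetical numbers (not taken from the CE3X report):

```python
def simple_payback(cost_eur, annual_saving_eur):
    """Years until cumulative savings cover the investment (undiscounted)."""
    return cost_eur / annual_saving_eur

def npv(cost_eur, annual_saving_eur, years, rate):
    """Net present value: discounted savings minus the upfront cost."""
    pv = sum(annual_saving_eur / (1 + rate) ** y for y in range(1, years + 1))
    return pv - cost_eur

# Hypothetical measure: 5000 EUR facade insulation saving 500 EUR/year
payback = simple_payback(5000.0, 500.0)            # 10 years
value = npv(5000.0, 500.0, 20, 0.03)               # discounted at 3%
```

A measure with a positive NPV over its service life is worth ranking above one that merely has a short payback, since payback ignores both the discount rate and savings beyond the break-even year.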

Abstract:

Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals, and expectations. Objectives: Current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The users' requirements can be represented as a case in a defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to this end. This paper presents a new ontological method for capturing individual users' requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulation, mapping, configuration, and learning content. The results were used as feedback for improvement. Result: The research has produced an ontology model with a set of techniques that support profiling users' requirements, reasoning over requirements patterns, generating workflow from norms, and formulating information provision specifications. Conclusion: Current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling individual users' needs and discovering users' requirements.
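Generating workflow from norms can be pictured as condition-action rules evaluated against a user profile. The rule set and profile fields below are purely hypothetical illustrations of the idea, not the paper's actual ontology:

```python
# Hypothetical norms: "whenever <condition on the user profile> holds,
# then <information-provision action>". Matching norms, in order, yield
# a personalised provision workflow for that user.
NORMS = [
    (lambda p: p["level"] == "beginner", "deliver introductory material"),
    (lambda p: "exam" in p["goals"], "schedule practice assessment"),
    (lambda p: p["level"] == "advanced", "recommend research articles"),
]

def workflow(profile):
    """Derive an ordered information-provision workflow from matching norms."""
    return [action for condition, action in NORMS if condition(profile)]

steps = workflow({"level": "beginner", "goals": ["exam"]})
```

Two users with different profiles thus receive different workflows from the same norm base, which is the essence of personalised provision as described above.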

Abstract:

Three naming strategies are discussed that allow the processes of a distributed application to continue being addressed by their original logical names throughout all the migrations they may be forced to undertake for performance-improvement reasons. A simple centralised solution is discussed first; it exhibited a software bottleneck as the number of processes increased. Two other solutions are then considered, entailing different communication schemes and different communication overheads for the naming protocol. All these strategies rely on the facility that each process is allowed to survive after migration, even at its original site, solely to provide a forwarding service for communications that use its obsolete address.
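The forwarding facility underlying these strategies can be sketched as follows, assuming the centralised variant (the first strategy discussed); class and field names are illustrative, not from the paper:

```python
class NameService:
    """Centralised name-server sketch: logical name -> current address.
    A migrated process leaves a forwarder at its old address, so stale
    addresses held by other processes still resolve to the new site."""

    def __init__(self):
        self.table = {}      # logical name -> current address
        self.forward = {}    # obsolete address -> next address in the chain

    def register(self, name, addr):
        self.table[name] = addr

    def migrate(self, name, new_addr):
        old = self.table[name]
        self.forward[old] = new_addr   # forwarder survives at the old site
        self.table[name] = new_addr

    def resolve(self, addr):
        """Chase the forwarding chain left by successive migrations."""
        while addr in self.forward:
            addr = self.forward[addr]
        return addr
```

In the centralised scheme every update funnels through this one table, which is precisely the bottleneck the paper observes as the number of processes grows; the other two strategies distribute the same forwarding information differently to trade off update and lookup traffic.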

Abstract:

Planning is a vital element of project management but it is still not recognized as a process variable. Its objective should be to outperform the initially defined processes, and foresee and overcome possible undesirable events. Detailed task-level master planning is unrealistic since one cannot accurately predict all the requirements and obstacles before work has even started. The process planning methodology (PPM) has thus been developed in order to overcome common problems of the overwhelming project complexity. The essential elements of the PPM are the process planning group (PPG), including a control team that dynamically links the production/site and management, and the planning algorithm embodied within two continuous-improvement loops. The methodology was tested on a factory project in Slovenia and in four successive projects of a similar nature. In addition to a number of improvement ideas and enhanced communication, the applied PPM resulted in 32% higher total productivity, 6% total savings and created a synergistic project environment.

Abstract:

The availability of a network depends strongly on the frequency of service outages and the recovery time for each outage. Losses of network resources include complete or partial failure of hardware and software components, power outages, scheduled maintenance of software and hardware, operational errors such as configuration mistakes, and acts of nature such as floods, tornadoes and earthquakes. This paper proposes a practical approach to enhancing QoS routing by providing alternative or repair paths in the event of a breakage of a working path. The proposed scheme guarantees that every Protected Node (PN) is connected to a multi-repair path, such that no further failure or breakage of single or double repair paths can cause any simultaneous loss of connectivity between an ingress node and an egress node. Links to be protected in an MPLS network are predefined, and an LSP request involves the establishment of a working path. The use of multi-protection paths permits the formation of numerous protection paths, allowing greater flexibility. Our analysis examines several methods, including single, double and multi-repair routes and the prioritization of signals along the protected paths, in order to improve Quality of Service (QoS) and throughput while reducing protection-path placement cost, delay, congestion and collisions.
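One simple way to obtain a repair path that is link-disjoint from the working path is to ban the working path's links and search again. The sketch below is a hop-count illustration of that idea only; it is not the paper's scheme, and the greedy two-step search can fail on graphs where a joint computation (e.g. Suurballe's algorithm) would still find a disjoint pair.

```python
from collections import deque

def bfs_path(adj, src, dst, banned=frozenset()):
    """Shortest hop-count path from src to dst avoiding banned
    undirected links (each link a frozenset of its two endpoints)."""
    queue, prev = deque([src]), {src: None}
    while queue:
        u = queue.popleft()
        if u == dst:
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj[u]:
            if v not in prev and frozenset((u, v)) not in banned:
                prev[v] = u
                queue.append(v)
    return None  # dst unreachable under the bans

def working_and_repair(adj, src, dst):
    """Working path, then a link-disjoint repair path (greedy two-step)."""
    work = bfs_path(adj, src, dst)
    used = {frozenset(edge) for edge in zip(work, work[1:])}
    return work, bfs_path(adj, src, dst, banned=used)
```

On the square topology A-B-D / A-C-D this yields A-B-D as the working path and A-C-D as its repair path; extending the ban set to the repair path's links gives the further (multi-repair) paths the scheme relies on, when the topology can supply them.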

Abstract:

This paper presents the ongoing research performed in order to integrate process automation and process management support in the context of media production. This has been addressed on the basis of a holistic approach to software engineering applied to media production modelling, to ensure design correctness, completeness and effectiveness. The focus of the research and development has been to enhance metadata management throughout the process, in a fashion similar to that achieved in Decision Support Systems (DSS), to facilitate well-grounded business decisions. The paper sets out the aims, objectives and methodology deployed, describes the solution in some detail, and presents preliminary conclusions and the planned future work.

Abstract:

This paper addresses the need for accurate predictions of the fault inflow, i.e. the number of faults found in consecutive project weeks, in highly iterative processes. In such processes, in contrast to waterfall-like processes, fault repair and the development of new features run almost in parallel. Given accurate predictions of fault inflow, managers could dynamically re-allocate resources between these different tasks in a more adequate way. Furthermore, managers could react with process improvements when the expected fault inflow is higher than desired. This study suggests software reliability growth models (SRGMs) for predicting fault inflow. Although originally developed for traditional processes, the performance of these models in highly iterative processes is investigated. Additionally, a simple linear model is developed and compared to the SRGMs. The paper provides results from applying these models to fault data from three different industrial projects. One of the key findings of this study is that some SRGMs are applicable for predicting fault inflow in highly iterative processes. Moreover, the results show that the simple linear model represents a valid alternative to the SRGMs, as it provides reasonably accurate predictions and performs better in many cases.
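A simple linear model of the kind the study compares against SRGMs can be sketched as an ordinary least-squares fit of weekly fault counts, extrapolated one week ahead. This illustrates the idea only; the study's exact model formulation is not reproduced here.

```python
def fit_line(ys):
    """Ordinary least-squares fit y = a + b*x, where x is the week index
    (0, 1, 2, ...) and y the observed weekly fault inflow."""
    n = len(ys)
    xs = range(n)
    mean_x, mean_y = (n - 1) / 2, sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

def predict_next(ys):
    """Predicted fault inflow for the week after the observed series."""
    a, b = fit_line(ys)
    return a + b * len(ys)
```

Refitting each week as new fault counts arrive turns this into the rolling predictor a manager would use to decide when to shift effort from feature development to fault repair.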

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model's quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.

Abstract:

Season-long monitoring of on-farm rice (Oryza sativa L.) plots in Nepal explored farmers' decision-making on the deployment of varieties to agroecosystems, the application of production inputs to varieties, agronomic practices, and the relationship between economic return and the area planted per variety. Farmers deploy varieties [landraces (LRs) and modern varieties (MVs)] to agroecosystems based on their understanding of the characteristics of the varieties and agroecosystems, and of the interaction between them. In marginal growing conditions, LRs can compete with MVs. Within an agroecosystem, economic return and the area planted to varieties have a positive relationship, but this is not so between agroecosystems. LRs are very diverse in agronomic and economic traits; therefore, they cannot be rejected a priori as inferior material without proper evaluation. LRs should be evaluated for useful traits and utilized in breeding programmes to generate farmer-preferred material for marginal environments and to support their on-farm conservation.

Abstract:

A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. To counteract the use of high-level, top-down modeling efforts, and to increase result accuracy, the focus was placed on device details and data routes. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides both ESD and physical distribution options. The ESD method included the calculation of the power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included, to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and the networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was apportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model, which revealed potential CO2e savings of 83% when ESD was used instead of physical distribution. The results also highlighted the importance of server efficiency and utilization methods.
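The model structure described above can be caricatured as a per-download energy sum: a server share, a per-hop network share, and client-side consumption, converted to CO2e through a grid emission factor and compared against a physical-distribution figure. Every coefficient below is a hypothetical placeholder for illustration, not a figure from the study.

```python
def download_co2e_g(size_gb, hops,
                    server_kwh_per_gb=0.05,   # hypothetical intensity figures,
                    hop_kwh_per_gb=0.01,      # NOT taken from the study
                    client_kwh_per_gb=0.02,
                    grid_g_per_kwh=450.0):
    """CO2e in grams for one electronic delivery of size_gb gigabytes
    traversing `hops` network devices: sum the per-GB energy shares,
    then convert via the grid emission factor."""
    kwh = size_gb * (server_kwh_per_gb
                     + hops * hop_kwh_per_gb
                     + client_kwh_per_gb)
    return kwh * grid_g_per_kwh

def saving_vs_physical(size_gb, hops, physical_co2e_g):
    """Fractional CO2e saving of ESD relative to physical distribution."""
    esd = download_co2e_g(size_gb, hops)
    return (physical_co2e_g - esd) / physical_co2e_g
```

Structuring the model this way makes the study's two sensitivity levers visible: the server term shrinks with better efficiency and utilization (virtualization), and the network term grows linearly with the number of data hops.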