987 results for software implementation
Abstract:
The increase in computational power and the emergence of new computer technologies have made local communication between personal trusted devices popular. In turn, this has created security problems related to the user data exchanged in such communication. One of the main aspects of assuring data security is the security of the software running on mobile devices. The aim of this work was to analyze security threats to PeerHood, software intended for personal communication between mobile devices regardless of the underlying network technology. To reach this goal, risk-based software security testing was performed. The testing showed that the project has several security vulnerabilities, so PeerHood cannot be considered secure software. The analysis made in this work is the first step towards implementing PeerHood security mechanisms and towards taking security into account in the project's development process.
Abstract:
This bachelor's thesis is part of a research project carried out in the summer of 2011 at Lappeenranta University of Technology. The goal of the project was to develop an automation concept for controlling an electrically excited synchronous motor. The thesis concentrates on implementing the automation concept in ABB's AC500 programmable logic environment. The automation program was developed as a state machine with ABB's PS501 Control Builder software. The automation program is operated through a fieldbus control and, via the CodeSys Visualization Tool, a local control with a control panel. The fieldbus control follows the ABB drives communication profile, and the local control is implemented with a function block that feeds the appropriate control words into the state machine. The field current control of the synchronous motor is realized with the method presented in the doctoral thesis of Olli Pyrhönen (Pyrhönen 1998). The method combines stator flux and torque based open-loop control with power factor based feedback control.
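As a rough illustration of the state-machine idea (not the actual AC500/PS501 implementation), the following Python sketch shows a drive-control state machine advanced by control-word bits. The state names and bit positions are hypothetical and do not reproduce the ABB drives communication profile.

```python
# Hedged sketch: a minimal drive-control state machine driven by a control word.
# States and bit masks are hypothetical, not the ABB drives communication profile.
from enum import Enum, auto

class DriveState(Enum):
    SWITCH_ON_INHIBITED = auto()
    READY_TO_SWITCH_ON = auto()
    OPERATION_ENABLED = auto()
    FAULT = auto()

# hypothetical control-word bit masks
BIT_SWITCH_ON = 0x01
BIT_ENABLE_OPERATION = 0x02
BIT_FAULT_RESET = 0x80

def step(state: DriveState, control_word: int, fault: bool) -> DriveState:
    """Advance the state machine by one scan cycle."""
    if fault:
        return DriveState.FAULT
    if state is DriveState.FAULT:
        return (DriveState.SWITCH_ON_INHIBITED
                if control_word & BIT_FAULT_RESET else DriveState.FAULT)
    if state is DriveState.SWITCH_ON_INHIBITED:
        return (DriveState.READY_TO_SWITCH_ON
                if control_word & BIT_SWITCH_ON else state)
    if state is DriveState.READY_TO_SWITCH_ON:
        return (DriveState.OPERATION_ENABLED
                if control_word & BIT_ENABLE_OPERATION else state)
    return state

# usage: simulate a start-up sequence
s = DriveState.SWITCH_ON_INHIBITED
for cw in (0x00, BIT_SWITCH_ON, BIT_SWITCH_ON | BIT_ENABLE_OPERATION):
    s = step(s, cw, fault=False)
    print(hex(cw), s.name)
```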
Abstract:
Objective: to analyze the implementation of a trauma registry in a university teaching hospital delivering care under the unified health system (SUS), and its ability to identify points for improvement in the quality of care provided. Methods: the data collection group comprised students from medicine and nursing courses, holders of FAPESP scholarships (technical training 1) or otherwise, overseen by the project coordinators. The itreg (ECO Sistemas-RJ/SBAIT) software was used as the database tool. Several quality "filters" were proposed to select cases for review in the quality control process. Results: data for 1344 trauma patients were entered into the itreg database between March and November 2014. Around 87.0% of cases were blunt trauma patients, 59.6% had RTS > 7.0 and 67% had ISS < 9. Full records were available for 292 cases, which were selected for review in the quality program. The audit filters most frequently triggered were laparotomy more than four hours after admission and drainage of acute subdural hematomas more than four hours after admission. Several points for improvement were flagged, such as control of overtriage of patients, the need to reduce the number of negative imaging exams, the development of protocols for achieving central venous access, and management of major TBI. Conclusion: the trauma registry provides a clear picture of the points to be improved in trauma patient care; however, there are specific peculiarities to implementing this tool in the Brazilian milieu.
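As a rough illustration of how such audit filters can be applied, the following Python sketch flags registry records matching quality criteria of the kind mentioned above. The field names are hypothetical and do not reflect the actual itreg schema.

```python
# Hedged sketch of the audit-filter idea: flag records matching quality-of-care
# criteria. Field names and thresholds are illustrative, not the itreg schema.
records = [
    {"id": 1, "mechanism": "blunt", "rts": 7.8, "iss": 5, "hours_to_laparotomy": None},
    {"id": 2, "mechanism": "penetrating", "rts": 5.9, "iss": 25, "hours_to_laparotomy": 6.5},
]

def audit_filters(rec):
    """Return the list of audit filters triggered by one record."""
    flags = []
    if rec.get("hours_to_laparotomy") is not None and rec["hours_to_laparotomy"] > 4:
        flags.append("laparotomy > 4 h after admission")
    if rec["rts"] <= 7.0:
        flags.append("RTS <= 7.0 (physiologically severe)")
    if rec["iss"] >= 9:
        flags.append("ISS >= 9 (anatomically severe)")
    return flags

for rec in records:
    print(rec["id"], audit_filters(rec))
```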
Abstract:
Scrum is an agile project management approach that has been widely practiced in software development projects. It has been shown to increase quality, productivity, customer satisfaction, transparency and team morale, among other benefits of its implementation. The concept of scrum is based on incremental innovation strategies, lean manufacturing, kaizen, iterative development and so on, and is usually contrasted with linear development models such as the waterfall method in the software industry. Traditional approaches to project management such as the waterfall method imply intensive upfront planning and approval of the entire project. Such approaches work well in well-defined, stable environments where all the specifications of the project are known at the beginning. In uncertain environments, however, where a project requires continuous development and the incorporation of new requirements, they tend not to work well. The scrum framework was inspired by Nonaka's article about new product development and was later adopted by software development practitioners. This research explores the conditions for, and benefits of, applying the scrum framework beyond software development projects. There are currently only a few case studies on scrum implementation in non-software projects, but there is a noticeable trend towards it in the scrum practitioners' community. The research is based on a real-life, multiple-case-study analysis of three different non-software projects. The results show that in order to succeed with scrum, projects need to satisfy certain conditions, both necessary and sufficient. Among them, the key factors are uncertainty of the project environment, outcomes that are not well defined, commitment of the scrum teams, and management support. The top advantages of scrum implementation identified in this research include improved transparency, accountability, team morale, communication, cooperation and collaboration. Further research is advised to validate these findings on a larger sample and to focus on more specific areas of scrum project management implementation.
Abstract:
Adapting and scaling up agile concepts, which are characterized by iterative, self-directed, customer-value-focused methods, may not be a simple endeavor. This thesis concentrates on studying the challenges of a large-scale agile software development transformation in order to enhance understanding of, and bring insight into, the underlying factors behind such emerging challenges. The topic is approached through the concepts of agility and of different methods compared to traditional plan-driven processes, complex adaptive systems theory, and the impact of organizational culture on agile transformation efforts. The empirical part was conducted as a qualitative case study. The internationally operating software development case organization had a year of experience of an agile transformation effort, during which it had also undergone organizational realignment efforts. The primary data collection was conducted through semi-structured interviews supported by participatory observation. As a result, the identified challenges were categorized under four broad themes: organizational, management, team dynamics, and process related. The identified challenges indicate that agility is a multifaceted concept. Agile practices may bring visibility to issues, many of which are embedded in the organizational culture or in the management style. Viewing software development as a complex adaptive system could facilitate understanding of the underpinning philosophy and, eventually, solving the issues: interactions are more important than processes, and solving a complex problem, such as novel software development, requires constant feedback and adaptation to changing requirements. Furthermore, an agile implementation seems to be unique in nature, and the agents engaged in the interaction are pivotal to the success of achieving agility. If agility is not a strategic choice for the whole organization, additional issues may arise due to different ways of working in different parts of the organization. Lastly, detailed suggestions to mitigate the challenges of the case organization are provided.
Abstract:
In a market where companies similar in size and resources compete, it is challenging to gain any advantage over others. In order to stay afloat, a company needs the capability to perform with fewer resources and yet provide better service. Hence the development of efficient processes that can cut costs and improve performance is crucial. As business expands, processes become complicated and large amounts of data need to be managed and made available on request. Companies use different tools to store and manage data, which facilitates better production and transactions. In the modern business world the most widely used tool for this purpose is the ERP (Enterprise Resource Planning) system. The focus of this research is to study how competitive advantage can be achieved by implementing a proprietary ERP system in a company: an ERP system that is created in-house and tailor-made to match and align with business needs and processes. The market is full of ERP software, but choosing the right one is a big challenge. Identifying the key features that need improvement in processes and data management, choosing the right ERP, implementing it, and the follow-up is a long and expensive journey that companies undergo. Some companies prefer to invest in a ready-made package bought from a vendor and adjust it to their own business needs, while others focus on creating their own system with in-house IT capabilities. In this research a case company is used, and the author tries to identify and analyze why the organization in question decided to pursue the development of a proprietary ERP system, how it has been implemented, and whether it has been successful. The main conclusion and recommendation of this research is that companies should know their core capabilities and constraints before choosing and implementing an ERP system. Knowledge of the factors that affect the outcome of a system change is important in order to make the right decisions at the strategic level and implement them at the operational level. The duration of the project in the case company has been longer than anticipated; it has been reported that projects to buy a ready product from a vendor are also delayed and completed over budget. Overall, the case company's implementation of a proprietary ERP has been successful, both in terms of business performance figures and the usability of the system by employees. For future research, a study statistically calculating the ROI of both approaches, buying a ready product and creating one's own ERP, would be beneficial.
Abstract:
Due to various advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications. With their computational power, these platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but is becoming an issue at ground level as well, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach where the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
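As a rough illustration of the dynamic reconfiguration idea on an abstract many-core model (not the formal refinement-based development of the thesis), the following Python sketch migrates tasks away from a core reported as faulty to the least-loaded healthy core. The names and the remapping policy are illustrative assumptions.

```python
# Hedged sketch: agent-style remapping of tasks away from a faulty core.
# Core/task names and the policy are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Core:
    cid: int
    healthy: bool = True
    tasks: list = field(default_factory=list)

def remap_faulty(cores):
    """Move tasks from faulty cores onto the least-loaded healthy core."""
    healthy = [c for c in cores if c.healthy]
    if not healthy:
        return
    for c in cores:
        if not c.healthy and c.tasks:
            target = min(healthy, key=lambda h: len(h.tasks))
            target.tasks.extend(c.tasks)
            c.tasks.clear()

cores = [Core(0, tasks=["fft"]), Core(1, tasks=["filter"]), Core(2)]
cores[0].healthy = False      # an agent reports a fault on core 0
remap_faulty(cores)
print([(c.cid, c.tasks) for c in cores])
```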
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers are now asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented. This is because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or lacking tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool-chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
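As a rough illustration of generating tests from a behavioral model (the thesis works from UML models with dedicated tool support), the following Python sketch enumerates test sequences from a small, hypothetical labelled transition system.

```python
# Hedged sketch: derive test sequences by walking a labelled transition system.
# The model and the depth-based coverage criterion are illustrative assumptions.

# hypothetical behavioral model: state -> {action: next_state}
model = {
    "Idle":    {"start": "Running"},
    "Running": {"pause": "Paused", "stop": "Idle"},
    "Paused":  {"resume": "Running", "stop": "Idle"},
}

def generate_tests(model, initial, depth):
    """Enumerate all action sequences of a given length from the initial state."""
    tests = []
    def walk(state, path):
        if len(path) == depth:
            tests.append(path)
            return
        for action, nxt in model[state].items():
            walk(nxt, path + [action])
    walk(initial, [])
    return tests

for seq in generate_tests(model, "Idle", 3):
    print(" -> ".join(seq))
```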
Abstract:
The review of intelligent machines shows that the demand for new ways of helping people perceive the real world is growing every year. This thesis provides information about the design and implementation of machine vision for a mobile assembly robot. The work has been done as part of an LUT project in the Laboratory of Intelligent Machines. The aim of this work is to create a working vision system; qualitative and quantitative research was done to complete this task. In the first part, the author presents the theoretical background: digital camera working principles, wireless transmission basics, creation of a live stream, and methods used for pattern recognition. Formulas, dependencies and previous research related to the topic are shown. In the second part, the equipment used for the project is described, with information about the brands, models, capabilities and the requirements for implementation. The author also describes the LabVIEW software, its add-ons and OpenCV, which are used in the project. The results can be found in a further section of the thesis; they are mainly represented by screenshots from the cameras and the working station and photos of the system. The key result of this thesis is the vision system created for the needs of the mobile assembly robot, and examples show graphically what was done. Future research in this field includes optimization of the pattern recognition algorithm, which will reduce the response time for recognizing objects. The system presented by the author can also be used for further activities involving artificial intelligence.
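As a rough illustration of one pattern-recognition step that OpenCV (named above) supports, the following Python sketch locates a known part in a camera frame with template matching. The file names and the match threshold are illustrative assumptions; the actual LabVIEW/OpenCV pipeline of the thesis is not reproduced here.

```python
# Hedged sketch: template matching with OpenCV to find a known part in a frame.
# File names and the 0.8 threshold are illustrative assumptions.
import cv2

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)          # camera snapshot
template = cv2.imread("part_template.png", cv2.IMREAD_GRAYSCALE)

# normalised cross-correlation; values close to 1.0 mean a strong match
result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.8:                                               # illustrative threshold
    h, w = template.shape
    cv2.rectangle(frame, max_loc, (max_loc[0] + w, max_loc[1] + h), 255, 2)
    print("part found at", max_loc, "score", round(max_val, 2))
else:
    print("no confident match")
```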
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land use patterns. An essential methodology for studying and quantifying such interactions is the adoption of land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and also to determine the relevance of driving forces. Modeling land use and land use changes has a long tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other hand by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grid, grid cells, attributes, etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land use model for analyzing the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure of merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, and the respective reference land-use maps were compiled for this period. It could be shown that efficient, automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
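As a rough illustration of a figure-of-merit style map comparison, the following sketch is written in Python (the scripting language embedded in SITE). It uses a common binary change/no-change formulation, FoM = hits / (hits + misses + false alarms); the exact measure and its coupling to the genetic-algorithm calibration in SITE may differ.

```python
# Hedged sketch: binary figure-of-merit comparison of a simulated change map
# against an observed reference change map. The formulation is a common one
# and is not necessarily identical to the measure implemented in SITE.
import numpy as np

def figure_of_merit(observed_change, simulated_change):
    """Both inputs are boolean arrays marking grid cells that changed land use."""
    hits = np.logical_and(observed_change, simulated_change).sum()
    misses = np.logical_and(observed_change, ~simulated_change).sum()
    false_alarms = np.logical_and(~observed_change, simulated_change).sum()
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

observed = np.array([[0, 1], [1, 1]], dtype=bool)
simulated = np.array([[0, 1], [0, 1]], dtype=bool)
print(figure_of_merit(observed, simulated))   # 2 / (2 + 1 + 0) = 0.67
```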
Abstract:
This article describes the work carried out to adapt metadata conforming to the official Colombian metadata standard NTC 4611 to the international standard ISO 19115. CatMDedit, an open source metadata editor, is used in this task. CatMDedit is able to import variants of CSDGM, such as NTC 4611, and to export to the stable version of ISO 19139 (the XML implementation model of ISO 19115).
Abstract:
This paper describes the solution devised for the implementation of a Geographic Information System that must serve the Instituto Universitario del Agua y del Medio Ambiente of the Universidad de Murcia and the Instituto Euromediterráneo del Agua. Given the nature of both institutions, it is a tool oriented primarily towards the study of water resources and hydrological processes. The process began with an identification of user needs (covering different profiles and requirements) and the subsequent development of a conceptual design that could ensure those needs are met. Because the user requirements demanded it, both users working in a Linux environment and those working in a Windows environment have been taken into account. A system based on free software was chosen, using GRASS for handling raster information and modelling; postgis (on postgreSQL) and GRASS for managing vector information; and QGIS, gvSIG and Kosmo as graphical user interfaces. Other programs used for specific purposes include R, Mapserver and GMT.
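As a rough illustration of how client tools could query the vector layer described above, the following Python sketch issues a PostGIS query through psycopg2. The database, table and column names ("hydrology", "subbasins", "geom", "name") and the bounding box are hypothetical.

```python
# Hedged sketch: querying vector data stored in PostGIS (on PostgreSQL) from Python.
# Connection parameters, table and column names are illustrative assumptions.
import psycopg2

conn = psycopg2.connect(dbname="hydrology", user="gis", host="localhost")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT name, ST_Area(geom::geography) / 1e6 AS area_km2
        FROM subbasins
        WHERE ST_Intersects(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
        """,
        (-2.0, 37.5, -0.5, 38.5),   # illustrative bounding box (lon/lat)
    )
    for name, area_km2 in cur.fetchall():
        print(name, round(area_km2, 1), "km^2")
conn.close()
```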
Abstract:
We present a system for dynamic network resource configuration in environments with bandwidth reservation. The proposed system is completely distributed and automates the mechanisms for adapting the logical network to the offered load. The system is able to dynamically manage a logical network such as a virtual path network in ATM or a label switched path network in MPLS or GMPLS. The system design and implementation are based on a multi-agent system (MAS) which makes the decisions of when and how to change a logical path. Despite the lack of a centralised global network view, results show that the MAS manages the network resources effectively, reducing the connection blocking probability and therefore achieving better utilisation of network resources. We also include details of its architecture and implementation.
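As a rough illustration of the adaptation idea (not the actual MAS decision logic of the article), the following Python sketch shows a per-path rule that grows or shrinks a bandwidth reservation as the offered load changes. The thresholds and step size are illustrative assumptions.

```python
# Hedged sketch: a per-path capacity-adjustment rule that tracks offered load.
# Thresholds, step size and units are illustrative assumptions.
def adjust_capacity(reserved_mbps, offered_mbps,
                    high=0.9, low=0.5, step_mbps=10, min_mbps=10):
    """Return a new reservation for one logical path given its measured load."""
    utilisation = offered_mbps / reserved_mbps
    if utilisation > high:
        return reserved_mbps + step_mbps        # pre-empt blocking: reserve more
    if utilisation < low and reserved_mbps - step_mbps >= min_mbps:
        return reserved_mbps - step_mbps        # release unused capacity
    return reserved_mbps

# usage: as the offered load rises, the reservation is stepped up
reserved = 50
for offered in (20, 46, 48, 30):
    reserved = adjust_capacity(reserved, offered)
    print(offered, "->", reserved)
```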
Abstract:
Comparison of brand-leading 'Wiki' software packages with respect to the implementation of a Wiki-style electronic resource for handbooks.
Abstract:
The entry into force of the Free Trade Agreement (TLC) with the United States gives Colombian entrepreneurs the opportunity to access the most important market in the world from a privileged position, under which it becomes easier for companies with an export vocation to place their products in that country. However, the high level of competition and development of this market means that companies need appropriate information that allows them to focus their efforts on specific products, market segments or States where they can achieve sustainability and durability over time, as well as develop new commercial possibilities. To that end, we carried out this research work, which seeks to produce a software tool containing information on the trade flow from Colombia to each of the 50 states of the USA, detailing in each case the commercial opportunities identified by tariff heading; it will serve as support for those Colombian entrepreneurs seeking to benefit from the new commercial situation offered by the bilateral agreement.