778 results for Development Process


Relevance:

60.00%

Publisher:

Abstract:

Doctoral dissertation in Child Studies, presented to the Instituto de Estudos da Criança, Universidade do Minho, under the supervision of Professora Doutora Júlia Oliveira Formosinho. Available at http://hdl.handle.net/1822/7085

Relevance:

60.00%

Publisher:

Abstract:

Savings and credit are recognized as key elements of economic development, but until disadvantaged members of the community have access to financial resources and services, they are obstructed from participating fully in the development process. Experience has shown that formal institutional credit has rarely reached the poorer sectors of society, who have had to rely on informal intermediaries such as savings groups and money-lenders. Local organizations such as co-operatives have attempted to respond to the development needs of disadvantaged communities, and the historical and international record of savings and credit co-operatives (i.e. credit unions) is examined in this context. Four recent initiatives to design and develop new forms of financial institutions that give fair, if not favoured, access to low-income households are also identified. These cases (from Zimbabwe, India, Bangladesh, and Ghana) are examined in an effort to identify common and divergent characteristics. Following from this analysis, a pilot project in Zimbabwe was initiated to elaborate an appropriate strategy for the development of a network of rural savings and credit organizations. The theoretical analysis, field exercise and subsequent reflections highlight the need for participatory methods of organizational design and development, rather than all-encompassing structural or policy guidelines.

Relevance:

60.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The growing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in driving up the costs of maintaining and repairing unforeseen software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, software testing and verification have been substantially automated. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path; hence, the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause from the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm's running time and the output subsequence length.
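
To make the core idea concrete, the following sketch reduces the traces of failing test cases to the code sequence they all share, a candidate for the faulty execution path. The pairwise longest-common-subsequence reduction is an illustrative simplification, not the dissertation's optimized algorithm, and the trace identifiers are invented for the example.

```python
from functools import reduce

def lcs(a, b):
    """Longest common subsequence of two traces via dynamic programming."""
    m, n = len(a), len(b)
    dp = [[[] for _ in range(n + 1)] for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + [a[i - 1]]
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1], key=len)
    return dp[m][n]

def suspicious_subsequence(failing_traces):
    """Reduce failing traces to the code sequence they all share,
    an approximation of the faulty execution path."""
    return reduce(lcs, failing_traces)

# Each trace is a sequence of executed statement/branch identifiers.
traces = [
    ["f1", "f3", "f4", "f7", "crash"],
    ["f1", "f2", "f3", "f4", "crash"],
    ["f3", "f4", "f6", "crash"],
]
print(suspicious_subsequence(traces))  # ['f3', 'f4', 'crash']
```

Note that pairwise reduction is only a heuristic for the multi-trace common subsequence; the optimizations described in the abstract (shorter covers, root-cause ranking) are not reproduced here.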

Relevance:

60.00%

Publisher:

Abstract:

Recent studies of the landscape have investigated the most important activities that contribute to its modification and have tried to better understand society through the marks left by everyday life. Singular landscapes are understood to constitute the cultural heritage of cities, since they are part of the daily life of citizens and are present in their social representations. Some contemporary authors defend the preservation of the natural and urban landscape, seeking especially to maintain its importance for the local population. Natal is a city whose environmental qualities are well defined and which is known for the beauty of its setting. Situated between a river and the sea, the city grew following its geographic characteristics. The Potengi River, the Atlantic Ocean and the vast dune ecosystem represented natural limits to urban expansion; at the same time, they favored the development of a landscape pattern marked by the dialectic between natural elements and human interventions. However, this relationship changed with the intensification of the high-rise development process that has taken place since the 1960s. Urban legislation tried to preserve the features of the local landscape by delimiting "Areas for Controlling Building Height", intended to protect the scenic value of some parts of the city. On the other hand, the civil construction sector has exerted constant pressure to abolish or modify this legal instrument, seeking profits that increased in the 1990s because of the consumption and qualification of urban space for tourist activities. New elements are needed to stimulate the debate about landscape preservation, the process of urban space production, and the best way to implement the legislation. This work seeks to raise such elements at the local level, using the experience of Natal to contribute to the formulation of indicators and to pose the question of the lack of measures for subjective values, such as the cultural and affective value of the landscape. The natural elements inserted in the urban profile represent strong visual references and give identity to the town; they are part of the collective imaginary and stand out in the social context of the city. Why, then, is the preservation of the landscape, which presupposes an improvement in the quality of life, not enough to justify the building-height controls already provided for in Natal's legislation? These questions lead us to approach the landscape as community heritage, warning that some of its significant aesthetic attributes must be preserved as a legacy for future generations.

Relevance:

60.00%

Publisher:

Abstract:

The digital revolution of the 21st century contributed to the emergence of the Internet of Things (IoT). Trillions of embedded devices using the Internet Protocol (IP), also called smart objects, will be an integral part of the Internet. In order to support such an extremely large address space, a new Internet Protocol, called Internet Protocol Version 6 (IPv6), is being adopted. The IPv6 over Low-Power Wireless Personal Area Networks (6LoWPAN) standard has accelerated the integration of wireless sensor networks (WSNs) into the Internet. At the same time, the Constrained Application Protocol (CoAP) has made it possible to provide resource-constrained devices with RESTful Web services functionality. This work builds upon previous experience in street lighting networks, for which a proprietary protocol, devised by the Lighting Living Lab, was implemented and used for several years. The proprietary protocol runs on a broad range of lighting control boards. In order to support heterogeneous applications with more demanding communication requirements and to improve the application development process, it was decided to port the Contiki OS to the four-channel LED driver (4LD) board from Globaltronic. This thesis describes the work done to adapt the Contiki OS to support the Microchip™ PIC24FJ128GA308 microprocessor and presents an IP-based solution to integrate sensors and actuators in smart lighting applications. Besides detailing the system's architecture and implementation, this thesis presents multiple results showing that the performance of CoAP-based resource retrieval in constrained nodes is adequate for supporting networking services in street lighting networks.
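
As an illustration of the kind of CoAP resource retrieval evaluated in such a system, the sketch below performs a GET on a hypothetical luminaire resource using the aiocoap Python library; the URI and resource path are invented for the example and are not the thesis's actual endpoints.

```python
import asyncio
from aiocoap import Context, Message, GET

async def read_luminaire(uri):
    """Fetch a sensor/actuator resource from a constrained node over CoAP."""
    protocol = await Context.create_client_context()
    request = Message(code=GET, uri=uri)
    response = await protocol.request(request).response
    return response.payload.decode()

# Hypothetical resource exposed by a street-lighting node over 6LoWPAN/IPv6.
print(asyncio.run(read_luminaire("coap://[2001:db8::1]/lamp/dim")))
```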

Relevance:

60.00%

Publisher:

Abstract:

Capacity analysis using simulation is not new in the literature. Much of the development process of UMTS standardization has relied on simulation tools; however, we think that the use of GIS planning tools and the matrix manipulation capabilities of MATLAB can show us different scenarios and enable a more realistic analysis. Some work is being done in COST 273 in order to obtain more realistic scenarios for UMTS planning. Our work was initially centered on uplink analysis, but we are now working on downlink analysis, specifically in two areas: capacity in number of users for RT and NRT services, and Node B power. In this work we show results for uplink capacity and some results for downlink capacity and BS power consumption.
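
As background for the uplink analysis, the number of simultaneous users a WCDMA cell can carry is often estimated from the standard uplink load-factor formula; the sketch below uses textbook-style parameter values that are assumptions for illustration, not figures from this work.

```python
def max_uplink_users(load_target, W, R, ebno_db, v, i):
    """Maximum simultaneous users on a WCDMA uplink for a target load factor.

    W: chip rate (3.84 Mcps), R: user bit rate, ebno_db: required Eb/No,
    v: activity factor, i: other-to-own-cell interference ratio.
    """
    ebno = 10 ** (ebno_db / 10)                       # dB -> linear
    load_per_user = (1 + i) * v / (1 + W / (R * ebno))
    return int(load_target / load_per_user)

# Illustrative 12.2 kbps voice service values (assumptions, not the paper's).
print(max_uplink_users(load_target=0.5, W=3.84e6, R=12.2e3,
                       ebno_db=5.0, v=0.67, i=0.55))  # ~48 users
```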

Relevance:

60.00%

Publisher:

Abstract:

Abstract – Background – The software effort estimation research area aims to improve the accuracy of effort estimation in software projects and activities. Aims – This study describes the development and usage of a web application to collect data generated from the Planning Poker estimation process, and the analysis of the collected data to investigate the impact of revising previous estimates when conducting similar estimates in a Planning Poker context. Method – Software activities were estimated by Universidade Tecnológica Federal do Paraná (UTFPR) computer students, using Planning Poker, with and without revising previous similar activities, storing data regarding the decision-making process. The collected data was used to investigate the impact that revising similar executed activities has on the accuracy of software effort estimates. Obtained Results – The UTFPR computer students were divided into 14 groups. Eight of them showed an accuracy increase in more than half of their estimates, three had almost the same accuracy in more than half of their estimates, and only three lost accuracy in more than half of their estimates. Conclusion – Reviewing similar executed software activities when using Planning Poker led to more accurate software estimates in most cases and can therefore improve the software development process.
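
One way estimate accuracy with and without revising previous similar activities might be compared is the magnitude of relative error (MRE); the metric and the numbers below are illustrative assumptions, since the abstract does not state the study's accuracy measure.

```python
def mre(actual_hours, estimated_hours):
    """Magnitude of relative error: |actual - estimate| / actual."""
    return abs(actual_hours - estimated_hours) / actual_hours

def accuracy_improved(actual, estimate_without, estimate_with):
    """True if revising similar past activities reduced the estimation error."""
    return mre(actual, estimate_with) < mre(actual, estimate_without)

# Illustrative activity: 10h actual effort, estimated with and without
# reviewing similar executed activities.
print(accuracy_improved(actual=10, estimate_without=16, estimate_with=12))  # True
```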

Relevance:

60.00%

Publisher:

Abstract:

This study examines customer and manufacturer participation in a collaborative innovation and product development process and its effects on firms' competitiveness. Customer-driven open innovation and product development is a significant source of competitive advantage, as it enables faster launches of product variations and the incorporation of customer needs into product development. The goal of this empirical study was to find out how collaborative innovation and product development can be implemented and managed in the design of fashion and sports apparel, a category of consumer goods. The study was conducted as a qualitative case study, and the empirical material was collected through thematic interviews in the case organization. There is no single comprehensive theory of collaborative development that could be drawn on; companies must therefore create their own framework for collaborative innovation and product development that best serves their business needs. The framework, the methods used, and the tools supporting collaboration depend, among other things, on the industry and its market situation, business models, the company's operating environment, customers, and the product area chosen for development. The empirical study shows that collaborative innovation can be a source of competitive advantage for both manufacturers and the customer, but it always requires company-specific adaptation. According to the study, a well-managed collaboration model, the right product choices, and versatile use of customer information have a positive effect on the results of a development project.

Relevance:

60.00%

Publisher:

Abstract:

Sustainability in software systems is still a new practice that most software developers and companies are trying to incorporate into their software development lifecycle, and it has been discussed extensively in academia. Sustainability is a complex concept viewed from economic, environmental and social dimensions, with several proposed definitions, which sometimes makes the concept fuzzy and difficult to apply and assess in software systems. This has hindered the adoption of sustainability in the software industry. Little research explores sustainability as a quality property of software products and services that would answer questions such as: how can sustainability be quantified as a quality construct in the same way as other quality attributes such as security, usability and reliability? How can it be applied to software systems? What are the measures and measurement scales of sustainability? The goal of this research is to investigate the definitions, perceptions and measurement of sustainability from the quality perspective. Grounded in the general theory of software measurement, the aim is to develop a method that decomposes sustainability into factors, criteria and metrics. The result is a method to quantify and assess the sustainability of software systems while incorporating management and user concerns. Conclusion: the method will enable companies to adopt sustainability easily while facilitating its integration into the software development process and tools. It will also help companies measure the sustainability of their software products along economic, environmental, social, individual and technological dimensions.
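
A minimal sketch of what a factors-criteria-metrics decomposition could look like, aggregating normalized metric scores into a single sustainability index; the dimensions, weights and scores are illustrative assumptions, not the method's actual content.

```python
# Hypothetical factors -> criteria -> normalized metric scores in [0, 1].
model = {
    "environmental": {"energy_efficiency": 0.7, "carbon_footprint": 0.5},
    "economic":      {"maintenance_cost": 0.6},
    "social":        {"accessibility": 0.8, "privacy": 0.9},
}
weights = {"environmental": 0.4, "economic": 0.3, "social": 0.3}

def sustainability_index(model, weights):
    """Weighted aggregation of per-factor criterion averages."""
    score = 0.0
    for factor, criteria in model.items():
        factor_score = sum(criteria.values()) / len(criteria)
        score += weights[factor] * factor_score
    return score

print(round(sustainability_index(model, weights), 3))  # 0.675
```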

Relevance:

60.00%

Publisher:

Abstract:

Internship report presented to the Escola Superior de Educação de Paula Frassinetti for the degree of Master in Pre-School Education and Teaching of the 1st Cycle of Basic Education.

Relevance:

60.00%

Publisher:

Abstract:

Due to the growth of design size and complexity, design verification is an important aspect of the logic circuit development process. The purpose of verification is to validate that the design meets the system requirements and specification. This is done by either functional or formal verification. The most popular approach to functional verification is the use of simulation-based techniques: using models to replicate the behaviour of an actual system is called simulation. In this thesis, a software/data structure architecture without explicit locks is proposed to accelerate logic gate circuit simulation. We call this system ZSIM. The ZSIM software architecture simulator targets low-cost SIMD multi-core machines. Its performance is evaluated on the Intel Xeon Phi and two other machines (Intel Xeon and AMD Opteron). The aim of these experiments is to:
• Verify that the data structure used allows SIMD acceleration, particularly on machines with gather instructions (section 5.3.1).
• Verify that, on sufficiently large circuits, substantial gains can be made from multicore parallelism (section 5.3.2).
• Show that a simulator using this approach out-performs an existing commercial simulator on a standard workstation (section 5.3.3).
• Show that the performance on a cheap Xeon Phi card is competitive with results reported elsewhere on much more expensive super-computers (section 5.3.5).
To evaluate ZSIM, two types of test circuits were used:
1. Circuits from the IWLS benchmark suite [1], which allow direct comparison with other published studies of parallel simulators.
2. Circuits generated by a parametrised circuit synthesizer. The synthesizer used an algorithm that has been shown to generate circuits that are statistically representative of real logic circuits, and it allowed testing of a range of very large circuits, larger than those for which it was possible to obtain open source files.
The experimental results show that with SIMD acceleration and multicore, ZSIM gained a peak parallelisation factor of 300 on the Intel Xeon Phi and 11 on the Intel Xeon. With only SIMD enabled, ZSIM achieved a maximum parallelisation gain of 10 on the Intel Xeon Phi and 4 on the Intel Xeon. Furthermore, it was shown that this software architecture simulator running on a SIMD machine is much faster than, and can handle much bigger circuits than, a widely used commercial simulator (Xilinx) running on a workstation. The performance achieved by ZSIM was also compared with similar pre-existing work on logic simulation targeting GPUs and supercomputers. It was shown that the ZSIM simulator running on a Xeon Phi machine gives simulation performance comparable to the IBM Blue Gene supercomputer at very much lower cost. The experimental results have shown that the Xeon Phi is competitive with simulation on GPUs and allows the handling of much larger circuits than have been reported for GPU simulation. When targeting the Xeon Phi architecture, the automatic cache management of the Xeon Phi handles the on-chip local store without any explicit mention of the local store in the architecture of the simulator itself, whereas targeting GPUs requires explicit cache management in the program, which increases the complexity of the software architecture. Furthermore, one of the strongest points of the ZSIM simulator is its portability: the same code was tested on both AMD and Xeon Phi machines, and the same architecture that performs efficiently on the Xeon Phi was ported to a 64-core NUMA AMD Opteron.
To conclude, the two main achievements are restated as follows: the primary achievement of this work was proving that the ZSIM architecture is faster than previously published logic simulators on low-cost platforms; the secondary achievement was the development of a synthetic testing suite that went beyond the scale range previously publicly available, based on prior work showing that the synthesis technique is valid.
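
As an illustration of the kind of data-parallel gate evaluation such an architecture enables, the sketch below simulates a levelized two-level netlist with NumPy, using gather/scatter-style indexing in the spirit of SIMD execution; the gate encoding and arrays are illustrative assumptions, not ZSIM's actual data structures.

```python
import numpy as np

# Levelized netlist: gates at the same level have no mutual dependencies,
# so each level can be evaluated in one data-parallel (SIMD-like) pass.
AND, OR, XOR = 0, 1, 2
values = np.array([1, 0, 1, 1, 0, 0, 0], dtype=np.uint8)  # nets 0-3: inputs

levels = [
    # (gate_type, input_a, input_b, output) per gate, grouped by level
    np.array([[AND, 0, 1, 4], [OR, 2, 3, 5]]),
    np.array([[XOR, 4, 5, 6]]),
]

for gates in levels:
    a, b = values[gates[:, 1]], values[gates[:, 2]]   # gather inputs
    out = np.where(gates[:, 0] == AND, a & b,
          np.where(gates[:, 0] == OR, a | b, a ^ b))
    values[gates[:, 3]] = out                          # scatter outputs

print(values[6])  # 1: (1 AND 0) XOR (1 OR 1) = 0 XOR 1
```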

Relevance:

60.00%

Publisher:

Abstract:

Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to many organizations in many fields. The Brazilian software industry has some peculiarities that distinguish it from the industries of developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the process that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements, employing an exploratory, descriptive and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; a questionnaire was applied, and an interview was conducted with one of each firm's main professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that proved very useful for the analysis carried out through the interpretation of the interviews. As a result, it was found that companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant, while Scrum is the most used methodology among the organizations based on the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in ways that generate different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' schooling and background, because they relate closely to the software firms; in this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was identified as low, which gives organizational actors a chance to act independently.

Relevance:

60.00%

Publisher:

Abstract:

Call Level Interfaces (CLI) play a key role in the business tiers of relational and some NoSQL database applications whenever fine-grained control between application tiers and the host databases is a key requirement. Unfortunately, in spite of this significant advantage, CLI are low-level APIs and thus do not address high-level architectural requirements. Among the examples, we emphasize two situations: a) the need to decouple, or not, the development process of business tiers from the development process of application tiers, and b) the need to automatically adapt business tiers to new business and/or security needs at runtime. To tackle these CLI drawbacks, and simultaneously keep their advantages, this paper proposes an architecture relying on CLI from which multi-purpose business tier components are built, herein referred to as Adaptable Business Tier Components (ABTC). Beyond the reference architecture, this paper presents a proof of concept based on Java and Java Database Connectivity (an example of a CLI).
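
To convey the idea of an ABTC concretely, here is a minimal sketch using Python's DB-API with sqlite3 as an analogous call-level interface (the paper's proof of concept uses Java and JDBC); the class and method names are assumptions for illustration, not the paper's design.

```python
import sqlite3

class AdaptableBusinessTierComponent:
    """Wraps a call-level interface so that the statements a business tier
    may run can be re-configured at runtime (e.g. for new security needs)."""

    def __init__(self, conn, allowed_statements):
        self.conn = conn
        self.allowed = dict(allowed_statements)   # name -> parameterized SQL

    def adapt(self, name, sql):
        """Swap or add a statement at runtime without redeploying the tier."""
        self.allowed[name] = sql

    def execute(self, name, params=()):
        if name not in self.allowed:
            raise PermissionError(f"statement '{name}' not permitted")
        return self.conn.execute(self.allowed[name], params).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'ana')")

abtc = AdaptableBusinessTierComponent(
    conn, {"by_id": "SELECT name FROM users WHERE id = ?"})
print(abtc.execute("by_id", (1,)))   # [('ana',)]
```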

Relevance:

60.00%

Publisher:

Abstract:

Call Level Interfaces (CLI) are low-level APIs that play a key role in database applications whenever fine-grained control between application tiers and the host databases is a key requirement. Unfortunately, in spite of this significant advantage, CLI were not designed to address organizational requirements or contextual runtime requirements. Among the examples, we emphasize the need to decouple, or not, the development process of business tiers from the development process of application tiers, as well as the need to automatically adapt to new business and/or security needs at runtime. To tackle these CLI drawbacks, and simultaneously keep their advantages, this paper proposes an architecture relying on CLI from which multi-purpose business tier components are built, herein referred to as Adaptable Business Tier Components (ABTC). This paper presents the reference architecture for those components and a proof of concept based on Java and Java Database Connectivity (an example of a CLI).

Relevance:

60.00%

Publisher:

Abstract:

The company studied is a Finnish firm that manufactures and sells paints and lacquers internationally. In 2010 the company adopted new production and supply chain objectives and plans, and this study is part of that overall development effort. The study examines OEE, a tool for measuring and improving the effectiveness of production and maintenance, and SMED, a tool for reducing product changeover times. The theoretical part of the thesis is based mainly on academic publications, but also on interviews, books, web pages and one annual report. In the empirical part, the problems and successes of the OEE implementation were studied with a repeatable user survey. The potential and implementation of OEE were also studied by examining production and availability data collected from a production line. SMED was studied with a computer program based on it, at a theoretical level only; it was not yet implemented in practice. According to the results, OEE and SMED suit the case company well and have considerable potential. OEE reveals not only the amount of availability losses but also their structure, so OEE results let the company direct its limited production and maintenance improvement resources to the right places. The production line examined in this work produced nothing during 56% of all planned production time in April 2016, and 44% of the line's stoppage time was due to changeover, start-up or shutdown work. From these results it can be concluded that availability losses are a serious problem for the company's production efficiency and that reducing changeover work is an important improvement target. Changeover time could be reduced by roughly 15% with simple and cheap changes to work order and tools identified with SMED; the improvement would be even larger with more comprehensive changes. The greatest potential of SMED may lie not in shortening changeover times but in standardizing them.
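
For reference, OEE is conventionally computed as the product of availability, performance and quality; the sketch below uses illustrative numbers, not the company's data (only the 56% idle share reported above is reflected in the availability figure).

```python
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    """Overall Equipment Effectiveness = availability x performance x quality."""
    availability = run_time / planned_time
    performance = (ideal_cycle_time * total_count) / run_time
    quality = good_count / total_count
    return availability * performance * quality

# Illustrative shift: 480 min planned, 211 min actually running (cf. the
# 56% idle share above), 0.5 min ideal cycle, 380 units made, 370 good.
print(round(oee(480, 211, 0.5, 380, 370), 3))  # ~0.385
```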