991 results for software creation methodology
Abstract:
The traditional forest industry is a good example of the changing nature of the competitive environment in many industries. Faced with drastic challenges, forest-industry companies are forced to search for new value-creating strategies in order to create competitive advantage. The emerging bioenergy business is now offering promising avenues for value creation for both the forest and energy sectors because of their complementary resources and knowledge with respect to bioenergy production from forest-based biomass. The key objective of this dissertation is to examine the sources of sustainable competitive advantage and the value-creation opportunities that are emerging at the intersection between the forest and energy industries. The research topic is considered from different perspectives in order to provide a comprehensive view of the phenomenon. The study discusses the business opportunities that are related to producing bioenergy from forest-based biomass, and sheds light on the greatest challenges and threats influencing the success of collaboration between the forest and energy sectors. In addition, it identifies existing and potential bioenergy actors, and considers the resources and capabilities needed in order to prosper in the bioenergy field. The value-creation perspective is founded on strategic management accounting, the theoretical frameworks are adopted from the field of strategic management, and the future aspect is taken into account through the application of futures studies research methodology. This thesis consists of two parts. The first part provides a synthesis of the overall dissertation, and the second part comprises four complementary research papers. The research setting is explorative in nature, and both qualitative and quantitative research methods are used. As a result, the thesis lays the foundation for non-technological studies on bioenergy.
It gives an example of how to study new value-creation opportunities at an industrial intersection, and discusses the main determinants affecting the value-creation process. In order to accomplish these objectives, the phenomenon of value creation at the intersection between the forest and energy industries is theorized and connected with the dynamic resource-based view of the firm.
Abstract:
Permanent Preservation Areas (PPAs) along watercourses have been the focus of numerous studies, not only because of the fragility and ecological relevance of riverine vegetation, but also because of the demonstrated inefficiency in conforming to the legislation protecting it. One of the major difficulties encountered in guaranteeing the effective conservation of these riverside areas is the absence of methodologies that can define them rapidly and accurately without manually determining the widths of the rivers or assigning only uniform linear values to the entire watercourse. The present work sought to develop a spatial analysis methodology capable of automatically defining permanent preservation areas along watercourses using geographic information system (GIS) software. The study was undertaken in the Sergipe River basin, considering the river itself and its principal tributaries. We used the database of the Digital Atlas of Hydrological Resources (SEMARH/SE), and the delimitations of the PPAs were performed using ArcGIS 10.1 and the XToolPro 9.0 extension. A total of 5,003.82 hectares of Permanent Preservation Areas were delimited along the margins of the rivers analyzed, with a margin of error of <1% in delimiting the widths of the rivers within the entire area considered. The methodology described here can be used to define PPAs efficiently, relatively rapidly, and with very small margins of error, thus representing a technological advance in the use of GIS for land management.
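The core geometric step behind this kind of automated delimitation can be sketched in a few lines: derive the legally mandated buffer width from the measured river width, then buffer the centerline. The width tiers below are a simplification inspired by the Brazilian Forest Code, and the example reach is invented; the study itself used ArcGIS 10.1, not this hand-rolled geometry.

```python
import math

def ppa_buffer_width(river_width_m: float) -> float:
    """Simplified riparian buffer-width tiers (illustrative, not the full legal rule)."""
    if river_width_m < 10:
        return 30.0
    if river_width_m < 50:
        return 50.0
    return 100.0

def straight_reach_ppa_area(reach_length_m: float, river_width_m: float) -> float:
    """Area (m^2) of the buffer around a straight centerline:
    a rectangle covering both margins plus two half-disc end caps."""
    w = ppa_buffer_width(river_width_m)
    return reach_length_m * 2 * w + math.pi * w ** 2

# A hypothetical 1 km reach of a river narrower than 10 m.
area_m2 = straight_reach_ppa_area(reach_length_m=1000.0, river_width_m=8.0)
print(round(area_m2 / 10_000, 2), "hectares")  # 6.28 hectares for this toy reach
```

In a real GIS workflow the buffer is applied to the digitized channel polygon rather than a straight centerline, which is where the sub-1% width error reported above comes in.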
Abstract:
The aim of this study was to group, by similarity, the temporal profiles of the 10-day composite NDVI product obtained by the SPOT Vegetation sensor for municipalities with high soybean production in the state of Paraná, Brazil, in the 2005/2006 cropping season. Data mining is a valuable tool for extracting knowledge from a database, identifying valid, new, potentially useful and understandable patterns. We therefore generated clusters using the K-Means, MAXVER and DBSCAN algorithms implemented in the WEKA software package. Clusters were created based on the average NDVI temporal profiles of the 277 municipalities with high soybean production in the state, and the best results were found with the K-Means algorithm, which grouped the municipalities into six clusters over the period from the beginning of October until the end of March, equivalent to the crop's vegetative cycle. Half of the generated clusters presented a spectro-temporal pattern characteristic of soybean and fell mostly within the soybean belt of the state of Paraná, showing that good results were obtained with the proposed methodology for identifying homogeneous areas. These results will be useful for creating regional soybean "masks" to estimate the planted area of this crop.
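The clustering step above can be illustrated with a minimal, pure-Python K-Means over synthetic NDVI profiles. The profiles are invented stand-ins for the 10-day composites (18 values roughly spanning October to March); the study itself used the WEKA implementations, not this sketch.

```python
import math
import random

def kmeans(profiles, k, iters=50, seed=42):
    """Bare-bones K-Means on equal-length numeric profiles."""
    rng = random.Random(seed)
    centroids = rng.sample(profiles, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in profiles:
            # assign each profile to the nearest centroid (squared Euclidean distance)
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[nearest].append(p)
        # recompute each centroid as the mean profile of its cluster
        centroids = [
            [sum(vals) / len(cluster) for vals in zip(*cluster)] if cluster else centroids[i]
            for i, cluster in enumerate(clusters)
        ]
    return centroids, clusters

rng = random.Random(0)
# soybean-like profiles: NDVI rises to a mid-season peak, then falls
soy = [[0.2 + 0.6 * math.exp(-((t - 9) / 3) ** 2) + rng.uniform(-0.02, 0.02)
        for t in range(18)] for _ in range(10)]
# non-crop profiles: roughly flat NDVI all season
flat = [[0.3 + rng.uniform(-0.02, 0.02) for t in range(18)] for _ in range(10)]

centroids, clusters = kmeans(soy + flat, k=2)
print([len(c) for c in clusters])
```

With well-separated profiles like these, the mean profile of each cluster plays the role of the "average temporal profile" the abstract describes.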
Abstract:
Business models in the context of international entrepreneurship are a rather new topic in the academic literature. The objective of this thesis is to examine value creation through business models in internationally entrepreneurial firms. The study examines value creation through the two partner interfaces and the customer interface of a company. Central to the study is the consideration of the partners' incentives as well. The business model construct is studied by defining the concept, examining its elements and its relationship with strategy, and concluding with value creation through the concept. The international entrepreneurship chapter focuses on internationally entrepreneurial firms, inspecting the drivers behind international entrepreneurship and studying the value network concept. Value creation functions as a driving theme in the theoretical discussion. The empirical research of the study focuses on eight Finnish internationally entrepreneurial software companies. The study is conducted as a qualitative cross-case analysis building on the single-case company business model analyses. The findings suggest that the business models of software companies exhibit considerable similarities. However, the degree of international experience influences the companies' value creation and the way they organize their activities both upstream and downstream in the value chain.
Abstract:
Corporate decisions to scale agile software development methodologies in offshoring environments have been hindered by the challenges involved, as agile methodologies are often regarded as suitable only for small projects and co-located teams. Although models such as the Agile Scaling Model (ASM) have been developed for scaling agile along different factors, companies' inability to identify and address these challenges leads to project failure rather than to the benefits of using agile methodologies. This failure can be avoided when scaling agile in an IT offshoring environment by determining the key challenges involved and then preparing strategies to address them. These key challenges can be determined by studying the issues related to offshoring and agile individually, while also considering the positive impact of agile methodologies in offshoring environments. Possible strategies to tackle these key challenges are then developed according to the nature of the individual challenges, utilizing the benefits of different agile methodologies to address each situation. Thus, in this thesis we propose a strategy based on a hybrid agile method, an increasingly common approach owing to the adaptive nature of agile. The determination of the key challenges and the possible strategies for tackling them are supported by a survey conducted in the organization studied.
Abstract:
Consumers create a great deal of content on the Internet. As they do not receive monetary compensation for doing so, it seems apparent that other types of reward are derived from giving up one's time and other resources. The purpose of this study is to describe value creation and user participation in a virtual community. It can be broken down into three research questions. 1. What is the value creation logic of a virtual community? 2. What value is perceived by virtual community users? 3. What is the association between the value perceived by virtual community users and their participation in a community? The study employs the discussion on value co-creation as well as perspectives on the notion of value for consumers to create a theoretical framework for value creation. To understand value creation in the context of virtual communities and to create a theoretical framework for user participation, existing literature and research on virtual communities is discussed. The empirical part of the study employs quantitative methodology to analyze data collected by sending a survey questionnaire to the users of a Finnish wellbeing-based virtual community. The results indicate that virtual community users perceive self-development, enjoyment, reputation-building and community-commitment value when using the service, and that these value perceptions are associated with community participation. Moreover, it was found that different types of value are associated with different forms of participation. Based on the findings, it is suggested that the four types of value make up a considerable share of the value for virtual community users. Moreover, as the results indicate that different value types are associated with different forms of participation, it is suggested that virtual community organizers consider what forms of participation they want to promote and design their virtual communities to support the creation of the different types of value accordingly.
Abstract:
Fifty bursae of Fabricius (BF) were examined by conventional optical microscopy, and digital images were acquired and processed using Matlab® 6.5 software. An Artificial Neural Network (ANN) was generated using Neuroshell® Classifier software, and the optical and digital data were compared. The ANN produced a classification of the digital scores comparable to the optical scores, correctly classifying the majority of the follicles and reaching a sensitivity of 89% and a specificity of 96%. When the follicles were scored and grouped in a binary fashion, the sensitivity increased to 90%, with a specificity of 92%. These results demonstrate that the combination of digital image analysis and an ANN is a useful tool for the pathological classification of BF lymphoid depletion. In addition, it provides objective results that allow the magnitude of the error in diagnosis and classification to be measured, thereby making comparisons between databases feasible.
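The two evaluation metrics reported here are standard confusion-matrix quantities, and computing them is straightforward. The label vectors below are invented examples, not the study's follicle scores; the "depleted"/"normal" class names are illustrative assumptions.

```python
def sensitivity_specificity(y_true, y_pred, positive="depleted"):
    """Sensitivity = true-positive rate; specificity = true-negative rate."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical labels: 5 truly depleted follicles, 5 normal;
# the classifier misses one of each.
truth = ["depleted"] * 5 + ["normal"] * 5
preds = ["depleted"] * 4 + ["normal"] + ["normal"] * 4 + ["depleted"]
sens, spec = sensitivity_specificity(truth, preds)
print(sens, spec)  # 0.8 0.8: 4/5 positives and 4/5 negatives correctly classified
```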
Abstract:
The recent emergence of a new generation of mobile application marketplaces has changed the business in mobile ecosystems. The marketplaces have gathered over a million applications from hundreds of thousands of application developers and publishers. Thus, software ecosystems, consisting of developers, consumers and the orchestrator, have emerged as a part of the mobile ecosystem. This dissertation addresses the new challenges faced by mobile application developers in the new ecosystems through empirical methods. Using the theories of two-sided markets and business ecosystems as the basis, the thesis assesses monetization and value creation in the market as well as the impact of electronic word-of-mouth (eWOM) and developer multihoming, i.e., contributing to more than one platform, in the ecosystems. The data for the study were collected by web crawling the three biggest marketplaces: Apple App Store, Google Play and Windows Phone Store. The dissertation consists of six individual articles. The results of the studies show a gap in monetization among the studied applications, while a majority of applications are produced by small or micro-enterprises. The study finds only weak support for the impact of eWOM on the sales of an application in the studied ecosystem. Finally, the study reveals a clear difference in the multihoming rates between the top application developers and the rest. As discussed in the thesis, this has an impact on future market analyses: it seems that the smart device market can sustain several parallel application marketplaces.
Abstract:
The purpose of this thesis is to find out how customer co-creation activities are managed in Finnish high-tech SMEs by understanding managers' views on the relevant issues. According to theory, issues such as firm size, customer knowledge implementation, lead customers, the fuzzy front end of product/service development, and the reluctance to engage in customer co-creation are some of the field's focal issues. The views of 145 Finnish SME managers on these issues were gathered as empirical evidence through an online questionnaire and analyzed with SPSS statistics software. The results show, firstly, that Finnish SME managers are aware of the issues associated with customer co-creation and are able to manage them actively. Additionally, managers performed well with regard to collaborating with lead customers and implemented customer knowledge evenly across the various stages of their new product and service development processes. Intellectual property rights emerged as an obstacle deterring managers from engaging in co-creation. The results suggest that in practice managers would do well to look for more opportunities to implement customer knowledge in the early and late stages of new product and service development, and to search actively for lead customers.
Abstract:
The objective of this study was to analyze the retinol equivalent and iron content in different food composition tables and nutritional evaluation software programs. A literature search was conducted to identify tables and software available in Brazil that contain information about retinol equivalent and iron content and are currently used by nutritionists. Ten tables and five software programs were selected for this study. The methodology used to present the retinol equivalent and iron content was evaluated, and no consistent pattern for obtaining such content was found in the tables and software programs analyzed. Only one of the tables had enough information for the calculation of retinol equivalents; this table is recommended for use throughout Latin America. As for the iron content, three of the tables analyzed stand out and therefore should be used; two of them are based on national foods and the other is recommended for use in all Latin American countries. None of the software programs evaluated uses the conversion factors suggested by IVACG to assess the vitamin A content in foods. Special attention should be given to the iron content provided in the software programs, since they use international tables and fortified foods as sources.
Abstract:
Assessing fish consumption is complex and involves several factors; however, the use of questionnaires in surveys and the use of the Internet as a tool to collect data have been considered promising approaches. Therefore, the objective of this research was to design a data collection technique using a questionnaire to assess fish consumption by making it available on a specific home page on the Internet. A bibliographical review was carried out to identify the features of the instrument, pre-tests were conducted with previous instruments, followed by the Focus Group technique, and specialists then performed an analysis and conducted an online pre-test. Multivariate data analysis was applied using the SmartPLS software. In total, 1,966 participants belonging to the University of São Paulo (USP) community took part in the test, and after the exclusion of some variables, statistically significant results were obtained. The final constructs comprised consumption, quality, and general characteristics. The instrument consisted of behavioral statements on a 5-point Likert scale and multiple-choice questions. The Cronbach's alpha reliability coefficient was 0.66 for general characteristics, 0.98 for quality, and 0.91 for consumption, indicating good reliability of the instrument. In conclusion, the results showed that Internet-based assessment is effective. The instrument allowed us to better understand the process of buying and consuming fish in the country, and it can be used as a basis for further research.
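The Cronbach's alpha coefficients reported above follow from a simple formula over the item variances and the variance of the total score. A minimal sketch, using a made-up response matrix (rows = respondents, columns = Likert items) rather than the study's data:

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(responses[0])                                  # number of items
    item_vars = [pvariance(col) for col in zip(*responses)]
    total_var = pvariance([sum(row) for row in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical 5-point Likert answers from five respondents to four items.
answers = [
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(answers), 2))  # 0.93: items vary together, so alpha is high
```

Values near the reported 0.91-0.98 indicate that the items of a construct measure the same underlying trait consistently.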
Abstract:
The number of security violations is increasing, and a security breach can have irreversible impacts on a business. There are several ways to improve organizational security, but some of them may be difficult to comprehend. This thesis demystifies threat modeling as part of secure system development. Threat modeling enables developers to reveal previously undetected security issues in computer systems. It offers a structured approach for organizations to find and address threats against vulnerabilities. When implemented correctly, threat modeling will reduce the number of defects and malicious attempts against the target environment. In this thesis, the Microsoft Security Development Lifecycle (SDL) is introduced as an effective methodology for reducing defects in the target system. SDL is traditionally meant to be used in software development; its principles can, however, be partially adapted to IT-infrastructure development. The Microsoft threat modeling methodology is an important part of SDL, and it is utilized in this thesis to find threats in Acme Corporation's factory environment. Acme Corporation is used as a pseudonym for a company providing high-technology consumer electronics. The target for threat modeling is the IT infrastructure of the factory's manufacturing execution system. The Microsoft threat modeling methodology utilizes the STRIDE mnemonic and data flow diagrams to find threats. The threat modeling performed in this thesis returned results that were important for the organization. Acme Corporation now has a more comprehensive understanding of the IT infrastructure of its manufacturing execution system. On top of the vulnerability-related results, threat modeling provided coherent views of the target system. Subject matter experts from different areas can now agree upon the functions and dependencies of the target system. Threat modeling was recognized as a useful activity for improving security.
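The STRIDE-over-data-flow-diagram approach mentioned above is mechanical enough to sketch: each STRIDE category maps to the security property it violates, and each diagram element type admits a subset of the categories. The element names and the per-element applicability rule below are simplified illustrations, not the thesis's Acme model.

```python
# The six STRIDE categories and the security property each one violates.
STRIDE = {
    "Spoofing": "authentication",
    "Tampering": "integrity",
    "Repudiation": "non-repudiation",
    "Information disclosure": "confidentiality",
    "Denial of service": "availability",
    "Elevation of privilege": "authorization",
}

# Simplified applicability: e.g. a data flow cannot itself be spoofed
# or used for privilege elevation in this reduced rule set.
APPLICABLE = {
    "process": list(STRIDE),
    "data flow": ["Tampering", "Information disclosure", "Denial of service"],
    "data store": ["Tampering", "Repudiation", "Information disclosure",
                   "Denial of service"],
}

def enumerate_threats(elements):
    """Cross each diagram element with the STRIDE categories that apply to it."""
    return [(name, threat, STRIDE[threat])
            for name, kind in elements
            for threat in APPLICABLE[kind]]

# Hypothetical fragment of a manufacturing-system data flow diagram.
dfd = [("MES server", "process"), ("production DB", "data store")]
for element, threat, prop in enumerate_threats(dfd):
    print(f"{element}: {threat} (violates {prop})")
```

Each emitted line is a candidate threat that an analyst then confirms, rates, and mitigates; tooling such as the Microsoft Threat Modeling Tool automates this enumeration over a drawn diagram.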
Abstract:
The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of various existing software development methods. We reviewed the main features of: (1) software engineering methodologies; (2) distributed system characteristics and their effect on software development; and (3) composite materials modeling activities and the requirements for the software development. Using design science as the research methodology, the distributed system for creating models of composite materials was created and evaluated. The empirical experiments we conducted showed good convergence between modeled and real processes. During the study, we paid particular attention to the complexity and importance of the distributed system and to a deep understanding of modern software engineering methods and tools.
Abstract:
Despite the growing popularity of participatory video as a tool for facilitating youth empowerment, the methodology and impacts of the practice are extremely understudied. This paper describes a study design created to examine youth media methodology and the ethical dilemmas that arose in its attempted implementation. Specifically, elements that added “rigor” to the study (i.e., randomization, pre- and post-measures, and an intensive interview) conflicted with the fundamental tenets of youth participation. The paper concludes with suggestions for studying participatory media methodologies that are more in line with an ethics of participation.
Abstract:
Modern societies depend more and more on computer systems, and there is thus increasing pressure on development teams to produce high-quality software. Many companies use quality models, suites of programs that analyze and evaluate the quality of other programs, but building quality models is difficult because several questions remain unanswered in the literature. We studied quality-modeling practices at a large company and identified three dimensions in which additional research is desirable: support for the subjectivity of quality, techniques for tracking quality as software evolves, and the composition of quality across different levels of abstraction. Regarding subjectivity, we proposed the use of Bayesian models because they can handle ambiguous data. We applied our models to the problem of detecting design defects. In a study of two open-source systems, we found that our approach outperforms the rule-based techniques described in the state of the art. To support software evolution, we treated the scores produced by a quality model as signals that can be analyzed using data-mining techniques to identify patterns in quality evolution, and we studied how design defects appear in and disappear from software. Software is typically designed as a hierarchy of components, but quality models do not take this organization into account. In the last part of the dissertation, we present a two-level quality model. 
These models have three parts: a model at the component level, a model that evaluates the importance of each component, and another that evaluates the quality of a composite by combining the quality of its components. The approach was tested on predicting change-prone classes from the quality of their methods. We found that our two-level models enable better identification of change-prone classes. Finally, we applied our two-level models to evaluating the navigability of websites from the quality of their pages. Our models were able to distinguish between very high-quality sites and randomly chosen sites. Throughout the dissertation, we present not only theoretical problems and their solutions, but also experiments conducted to demonstrate the advantages and limitations of our solutions. Our results indicate that the state of the art can be improved along the three dimensions presented. In particular, our work on quality composition and importance modeling is the first to target this problem. We believe that our two-level models are an interesting starting point for more in-depth research.
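The composition idea in this two-level scheme, aggregating component-level scores into a composite-level score weighted by component importance, can be sketched in a few lines. The method names, scores, and importance weights below are invented illustrations, not the dissertation's actual models.

```python
def composite_quality(component_scores, importance):
    """Importance-weighted mean of component quality scores (all in [0, 1])."""
    total_weight = sum(importance.values())
    return sum(score * importance[name]
               for name, score in component_scores.items()) / total_weight

# Hypothetical per-method quality scores for one class...
methods = {"parse": 0.9, "validate": 0.4, "render": 0.7}
# ...and importance weights, e.g. derived from call frequency or centrality.
importance = {"parse": 3.0, "validate": 1.0, "render": 2.0}

print(round(composite_quality(methods, importance), 2))  # 0.75
```

The interesting modeling question, which the dissertation addresses with a dedicated importance model, is where these weights come from; with uniform weights this degenerates into a plain average that ignores how components are actually used.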