947 results for component-based software development
Abstract:
Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to organizations in many fields. The Brazilian software industry has peculiarities that distinguish it from industries located in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. This study therefore aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the processes that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements, through an exploratory, descriptive and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; a form was applied and an interview was conducted with one of each firm's main professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that proved very useful for the analysis based on the interpretation of the interviews. As a result, it was found that companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) predominates; Scrum is the most widely used methodology among organizations based on the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in ways that generate different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' schooling and background, because they relate closely to the software firms. In this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was identified as low, which gives organizational actors a chance to act independently.
Abstract:
This paper presents how paradigms and methodologies for software development have changed rapidly over the last two years. The current scenario is undergoing a transition that, although slight, reflects the speed with which software production paradigms are reinvented as display devices and the ways end users interact with them change. Studies indicate that 2013 was the turning point at which internet access from mobile devices overtook access from traditional desktop devices; the split currently stands at around 60% mobile versus 40% desktop, and mobile's share is expected to keep growing in the coming years while desktop access declines (comScore). In this context, the software industry has reinvented and updated itself with respect to the technologies behind software and mobile applications, building products able to respond to the user market. Software products such as applications must be put into production for different user environments, such as Web, iOS and Android, in a way that enhances efficiency, optimization and productivity in the software development cycle (Langer, Arthur M.).
Abstract:
Low-frequency electromagnetic compatibility (EMC) is an increasingly important aspect in the design of practical systems to ensure the functional safety and reliability of complex products. The opportunities for using numerical techniques to predict and analyze a system's EMC are therefore of considerable interest in many industries. As the first phase of the study, a proper model, including all the details of the component, was required. Therefore, advances in EMC modeling were studied, classifying analytical and numerical models. The selected approach was finite element (FE) modeling coupled with the distributed network method, used to generate the model of the converter's components and obtain the frequency behavioral model of the converter. The method is able to reveal the behavior of parasitic elements and higher resonances, which have critical impacts in the study of EMI problems. For the EMC and signature studies of machine drives, equivalent source modeling was studied. Considering the details of the multi-machine environment, including actual models, some innovation in equivalent source modeling was introduced to decrease the simulation time dramatically. Several models were designed in this study, and the voltage-current cube model and the wire model gave the best results. A GA-based PSO method is used as the optimization process. Superposition and suppression of the fields when coupling the components were also studied and verified. The simulation time of the equivalent model is 80-100 times lower than that of the detailed model. All tests were verified experimentally. As an application of the EMC and signature study, fault diagnosis and condition monitoring of an induction motor drive were developed using radiated fields. In addition to experimental tests, 3D FE analysis was coupled with circuit-based software to implement the incipient fault cases. The identification was implemented using an ANN for seventy different fault cases. The simulation results were verified experimentally. Finally, identification of the types of power components was implemented. The results show that it is possible to identify the type of components, as well as the faulty components, by comparing the amplitudes of their stray field harmonics. Identification using the stray fields is nondestructive and can be used for setups that cannot go offline and be dismantled.
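The optimization step above is only described at a high level; as an illustration of the particle swarm part, the Python sketch below is a minimal plain PSO (not the thesis's GA-PSO hybrid), with a placeholder objective standing in for the fit between the detailed and equivalent-source field models.

    # Minimal particle swarm optimization sketch (illustrative only; the actual
    # work hybridizes PSO with GA and optimizes equivalent-source parameters).
    import random

    def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
        lo, hi = bounds
        pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]              # best position seen by each particle
        pbest_val = [f(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest_pos, gbest_val = pbest[g][:], pbest_val[g]   # global best
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest_pos[d] - pos[i][d]))
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))  # keep in bounds
                val = f(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest_pos, gbest_val = pos[i][:], val
        return gbest_pos, gbest_val

    # Placeholder objective: sum of squares stands in for the model mismatch.
    best, err = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))
    print(best, err)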
Abstract:
Software is an important infrastructural component of scientific research practice. The work of research often requires scientists to develop, use, and share software in order to address their research questions. This report presents findings from a survey of researchers at the University of Washington in three broad areas: Oceanography, Biology, and Physics. This survey is part of the National Science Foundation funded study Scientists and their Software: A Sociotechnical Investigation of Scientific Software Development and Sharing (ACI-1302272). We inquired about each respondent's research area and data use along with their use, development, and sharing of software. Finally, we asked about the challenges researchers face with software and their concerns regarding software's effect on study replicability. These findings are part of ongoing efforts to develop deeper characterizations of the role of software in twenty-first century scientific research.
Abstract:
This thesis aims to produce and test multilayer electrodes for use as photocathodes in a PEC device. The electrode developed is based on CIGS, a I-III-VI2 semiconductor composed of copper (Cu), indium (In), gallium (Ga) and selenium (Se). It has a bandgap in the range of 1.0-2.4 eV and an absorption coefficient of about 10^5 cm^-1, which makes it a promising photocathode for PEC water splitting. The idea of our multilayer electrode is to deposit a thin layer of CdS on top of the CIGS to form a solid-state p-n junction and lead to more efficient charge separation. In addition, another thin layer of AZO (aluminum-doped zinc oxide) is deposited on top of the CdS, since it forms a better alignment between the AZO/CdS/CIGS interfaces, which helps to drive the charge transport further and minimize charge recombination. Finally, a TiO2 layer on top of the electrodes is used as a protective layer during H2 evolution. FTO (fluorine-doped tin oxide) and molybdenum are used as back contacts. The thin layers were deposited by RF magnetron sputtering. The structural characterization performed by XRD measurements confirms a polycrystalline chalcopyrite structure with a preferential orientation along the (112) direction for the CIGS. From a linear fit of the Tauc plot, we obtain an energy gap of about 1.16 eV, and from four-point probe measurements we obtain a resistivity of 0.26 Ωcm. We performed an electrochemical characterization of our electrodes in a cell. The results show that our samples have good stability but produce a photocurrent of the order of μA, three orders of magnitude smaller than our target. The EIS analysis confirms a significant depletion of the species in front of the electrode, causing a lower conversion of the species and a smaller current flow.
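For reference, the band gap and resistivity quoted above are typically extracted with the standard relations sketched below (assuming a direct allowed transition for the Tauc analysis, and a film much thinner than the probe spacing for the four-point measurement):

    % Tauc plot: extrapolate the linear region of (alpha h nu)^2 versus h nu
    % to zero absorption; the intercept gives the optical gap E_g (about 1.16 eV here).
    (\alpha h\nu)^2 = A\,(h\nu - E_g)
    % Thin-film four-point probe resistivity from the measured V/I ratio,
    % with t the film thickness (about 0.26 Ohm cm here).
    \rho = \frac{\pi}{\ln 2}\, t\, \frac{V}{I}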
Abstract:
Obesity has been recognized as a worldwide public health problem. It significantly increases the chances of developing several diseases, including Type II diabetes. The roles of insulin and leptin in obesity involve reactions that can be better understood when they are presented step by step. The aim of this work was to design software with data from some of the most recent publications on obesity, especially those concerning the roles of insulin and leptin in this metabolic disturbance. The most notable characteristic of this software is the use of animations representing the cellular response together with the presentation of recently discovered mechanisms on the participation of insulin and leptin in processes leading to obesity. The software was field tested in the Biochemistry of Nutrition web-based course. After using the software and discussing its contents in chatrooms, students were asked to answer an evaluation survey about the whole activity and the usefulness of the software within the learning process. The teaching assistants (TA) evaluated the software as a tool to help in the teaching process. The students' and TAs' satisfaction was very evident and encouraged us to move forward with the software development and to improve the use of this kind of educational tool in biochemistry classes.
Late Quaternary cycles of mangrove development and decline on the north Australian continental shelf
Abstract:
Mangrove communities in the Australian tropics presently occur as narrow belts of vegetation in estuaries and on sheltered, muddy coasts. Palynological data from continental shelf and deep-sea cores indicate a long-term cyclical component of mangrove development and decline at a regional scale, which can be linked to specific phases of late Quaternary sealevel change. Extensive mangrove development, relative to today, occurs during periods of marine transgression, whereas very diminished mangrove occurs during marine regressions and during rarer periods of relative sea-level stability. Episodes of flourishing mangrove cannot be linked to phases of humid climate, as has been suggested in studies elsewhere. Rather, the cycle of expansion and decline of mangrove communities on a grand scale is explained in terms of contrasting physiographic settings characteristic of continental-shelf coasts during transgressive and regressive phases, in particular by the existence, or lack, of well-developed tidal estuaries. Copyright (C) 1999 John Wiley & Sons, Ltd.
Abstract:
Seasonal climate forecasting offers potential for improving management of crop production risks in the cropping systems of NE Australia. But how is this capability best connected to management practice? Over the past decade, we have pursued participative systems approaches involving simulation-aided discussion with advisers and decision-makers. This has led to the development of discussion support software as a key vehicle for facilitating infusion of forecasting capability into practice. In this paper, we set out the basis of our approach, its implementation and preliminary evaluation. We outline the development of the discussion support software Whopper Cropper, which was designed for, and in close consultation with, public and private advisers. Whopper Cropper consists of a database of simulation output and a graphical user interface to generate analyses of risks associated with crop management options. The charts produced provide conversation pieces for advisers to use with their farmer clients in relation to the significant decisions they face. An example application, detail of the software development process and an initial survey of user needs are presented. We suggest that discussion support software is about moving beyond traditional notions of supply-driven decision support systems. Discussion support software is largely demand-driven and can complement participatory action research programs by providing cost-effective general delivery of simulation-aided discussions about relevant management actions. The critical role of farm management advisers and dialogue among key players is highlighted. We argue that the discussion support concept, as exemplified by the software tool Whopper Cropper and the group processes surrounding it, provides an effective means to infuse innovations, like seasonal climate forecasting, into farming practice. Crown Copyright (C) 2002 Published by Elsevier Science Ltd. All rights reserved.
Abstract:
The European Union (EU) is one of the world's leading donors of official development assistance (ODA), which gives it strong weight in its relationships with recipient partner countries, in particular those that are more dependent on it. Besides the material weight of its funding, the EU has retained historical ties and influence in diplomatic, political and economic terms in many of its ODA recipient partner countries (particularly in Sub-Saharan Africa). Since the 2000s, the EU development policy has not only undergone major structural changes in its institutional framework but has also started to face a new international aid scenario. This paper explores why a normative-based EU development policy is being challenged by reformed EU institutions and a new global order, and how the EU is attempting to respond to this context in the face of the deepest recession since the end of the Second World War.
Abstract:
Analytical methods based on evaluation models of interactive systems were proposed as an alternative to user testing in the last stages of software development because of its cost. However, the use of isolated behavioral models of the system limits the results of these analytical methods: for example, they are unable to identify implementation issues that will impact usability. With the introduction of model-based testing we are able to test whether the implemented software meets the specified model. This paper presents a model-based approach for generating test cases from the static analysis of source code.
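As a sketch of the general idea only (not the authors' tool), the Python fragment below assumes the static analysis yields a behavioral model in the form of a finite state machine and enumerates one test case per reachable transition; all state and event names are hypothetical.

    # Hypothetical sketch: generate transition-coverage test cases from a
    # behavioral model (state machine) recovered by static analysis of the code.
    from collections import deque

    # Model: state -> {event: next_state}; assumed to have been extracted
    # from the source code by a static-analysis front end (not shown).
    model = {
        "LoginForm":  {"submit_valid": "Dashboard", "submit_invalid": "LoginError"},
        "LoginError": {"retry": "LoginForm"},
        "Dashboard":  {"logout": "LoginForm"},
    }

    def test_cases(model, start):
        """Return one event sequence (test case) per reachable transition."""
        # Shortest event path from the start state to every reachable state (BFS).
        paths, queue = {start: []}, deque([start])
        while queue:
            state = queue.popleft()
            for event, nxt in model.get(state, {}).items():
                if nxt not in paths:
                    paths[nxt] = paths[state] + [event]
                    queue.append(nxt)
        # A test case = path to the transition's source state + the transition itself.
        return [paths[s] + [e] for s in paths for e in model.get(s, {})]

    for case in test_cases(model, "LoginForm"):
        print(" -> ".join(case))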
Abstract:
Nowadays software has become a useful element in the lives of people and companies, and there is a growing need for quality applications as companies seek to differentiate themselves in the market. Software producers try to increase the quality of their development processes in order to guarantee the quality of the final product. The size and complexity of software increase the probability of non-conformities appearing in these products, hence the interest in software testing activities throughout the processes of design, development and maintenance. Many software development projects are delivered late because, at the planned completion date, they do not perform satisfactorily, are not reliable, or are difficult to maintain. Good planning of software production activities usually means greater efficiency across the whole production process, since it can reduce the number of defects and the costs of correcting them, increasing confidence in the use of the software and making it easier to operate and maintain. This underlines the importance of adopting good practices in software development: a systematic and organized approach must be used in order to produce quality software. This thesis describes the main software development models, the importance of requirements engineering, the testing processes and main validations of software quality, and how some companies apply these principles in their day-to-day work in order to produce a more reliable final product. It also describes some examples that complement the context of the thesis.
Abstract:
Master's degree in Electrical and Computer Engineering, specialization in Telecommunications.
Abstract:
To respond to major technological advances and, consequently, to the increasingly demanding attitude of customers, the company Francisco Parracho – Electrónica Industrial, Lda., which operates in the elevator business, decided to bring to market a dedicated controller for Liquid Crystal Display / Thin Film Transistor (LCD/TFT) screens. The goal is to replace a computer-based system, characterized by its large size and cost but until now unavoidable, namely for high screen resolutions. This is how this work came about. With a careful selection of all components and, above all, without useless functionality, an embedded system was obtained with far smaller dimensions and cost than its counterpart. The screen targeted for this project is an industrial-grade 10.4" Sharp Thin Film Transistor - Liquid Crystal Display (TFT-LCD) with a resolution of 800 x 600 pixels at 18 bits per pixel. For this purpose, an ATMEL 32-bit AVR micro-controller was chosen which, among other features, includes an LCD controller supporting resolutions up to 2048 x 2048 pixels, from 1 to 24 bits per pixel. Given that this product targets the elevator sector, both the hardware and the software functionality were designed for this field; however, the concept presented here carries over to any other area where the product could be applied, not least because the software is designed to be flexible. With the help of a development kit, the drivers for the controllers and base peripherals of this project were validated. This software was then applied to a printed circuit board, designed as part of this work, so that all the requirements set by the sponsoring company were fulfilled: display of images on the screen according to the floor; support for horizontally scrolling text; an animated indication of the elevator's direction of travel; representation of the floor with vertical scrolling; a brief description of the floor directory, also with vertical scrolling; a digital clock; reading of the desired content from an SD/MMC card; and the possibility of updating the content via a USB flash drive.
Abstract:
The recent trend in chip architectures toward higher numbers of heterogeneous cores and non-uniform memory/non-coherent caches brings renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, arguing that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots and progress in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
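As a purely illustrative sketch (not the paper's analysis), one simple way to bound the number of retained versions is to count how many update transactions can commit while the longest read-only transaction is in progress; the Python fragment below assumes each updater task writes the object at most once per arrival.

    # Illustrative sketch only (not the paper's analysis): bound the number of
    # versions a shared object must keep so that a read-only transaction that
    # started at any time still finds the snapshot it began with.
    from math import ceil

    def versions_needed(longest_reader, updater_periods):
        """
        longest_reader: worst-case duration of the longest read-only
                        transaction accessing the object.
        updater_periods: minimum inter-arrival times of the tasks whose
                         update transactions write the object (same unit).
        Assumption: each updater writes the object at most once per arrival,
        so the snapshots superseded while the reader runs are bounded by the
        update arrivals in that window, plus the version the reader started from.
        """
        updates_during_reader = sum(ceil(longest_reader / t) for t in updater_periods)
        return 1 + updates_during_reader

    # Example: a 4 ms read-only transaction with two updaters arriving at least
    # every 5 ms and 10 ms needs at most 1 + (1 + 1) = 3 retained versions.
    print(versions_needed(4.0, [5.0, 10.0]))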
Abstract:
Consider a single processor and a software system. The software system comprises components and interfaces where each component has an associated interface and each component comprises a set of constrained-deadline sporadic tasks. A scheduling algorithm (called global scheduler) determines at each instant which component is active. The active component uses another scheduling algorithm (called local scheduler) to determine which task is selected for execution on the processor. The interface of a component makes certain information about a component visible to other components; the interfaces of all components are used for schedulability analysis. We address the problem of generating an interface for a component based on the tasks inside the component. We desire to (i) incur only a small loss in schedulability analysis due to the interface and (ii) ensure that the amount of space (counted in bits) of the interface is small; this is because such an interface hides as much details of the component as possible. We present an algorithm for generating such an interface.
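As a hedged illustration of the size-versus-precision trade-off described above (not the paper's algorithm), a component's constrained-deadline sporadic tasks can be summarized by a few quantized samples of its demand bound function; fewer, coarser samples yield a smaller interface (fewer bits) but a safe over-approximation of demand, and hence some loss in schedulability analysis.

    # Hedged sketch (not the paper's algorithm): summarize a component as a few
    # quantized points of its demand bound function (DBF).
    from math import floor, ceil

    def dbf(tasks, t):
        """Demand bound function of constrained-deadline sporadic tasks,
        each given as (C = WCET, D = relative deadline, T = min inter-arrival)."""
        return sum(max(0, floor((t - D) / T) + 1) * C for (C, D, T) in tasks)

    def interface(tasks, horizon, n_points, quantum):
        """Sample the DBF at n_points instants and round demand up to a quantum;
        the interface is the list of (time, demand) pairs, each storable in a
        small fixed number of bits."""
        step = horizon / n_points
        points = []
        for k in range(1, n_points + 1):
            t = k * step
            demand = ceil(dbf(tasks, t) / quantum) * quantum  # safe over-approximation
            points.append((t, demand))
        return points

    # Example component: three tasks (C, D, T) in milliseconds.
    tasks = [(1, 4, 10), (2, 8, 20), (3, 15, 40)]
    print(interface(tasks, horizon=40, n_points=4, quantum=1))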