960 results for Computer software -- Development


Relevance:

90.00%

Publisher:

Abstract:

Computers have invaded our offices, our homes, our cars and our coffee-pots; they have become ubiquitous. However, the advance of computing technologies is associated with an increasing lack of "visibility" of the underlying software and hardware technologies. While we use and accept the computer, we know neither its history nor its functionality. In this paper, we argue that this is not a healthy situation. Moreover, recruitment onto UK Computing degree courses is steadily falling; these courses appear less attractive to school-leavers, which may be associated with this increasing ubiquity. In this paper we reflect on an MSc module, Concepts and Philosophy of Computing, and a BSc module, Computer Games Development, developed at the University of Worcester, which address these issues. We propose that the elements of these modules form a necessary part of the education of all citizens, and we suggest how this may be realized. We also suggest how to re-enthuse our youth about computing as a discipline and halt the drop in recruitment.

Relevance:

90.00%

Publisher:

Abstract:

VALENTIM, R. A. M.; SOUZA NETO, Plácido Antônio de. O impacto da utilização de design patterns nas métricas e estimativas de projetos de software: a utilização de padrões tem alguma influência nas estimativas? [The impact of using design patterns on software project metrics and estimates: does the use of patterns have any influence on the estimates?]. Revista da FARN, Natal, v. 4, p. 63-74, 2006.

Relevance:

90.00%

Publisher:

Abstract:

This paper describes two new techniques designed to enhance the performance of fire field modelling software: "group solvers" and automated dynamic control of the solution process, both of which are currently under development within the SMARTFIRE Computational Fluid Dynamics environment. The "group solver" is a derivation of common solver techniques used to obtain numerical solutions to the algebraic equations associated with fire field modelling. The purpose of "group solvers" is to reduce the computational overheads associated with the traditional numerical solvers typically used in fire field modelling applications. In an example discussed in this paper, the group solver is shown to provide a 37% saving in computational time compared with a traditional solver. The second technique, the automated dynamic control of the solution process, is achieved through the use of artificial intelligence techniques and is designed to improve the convergence capabilities of the software while further decreasing the computational overheads. The technique automatically controls solver relaxation using an integrated production rule engine with a blackboard to monitor and implement the required control changes during solution processing. Initial results for a two-dimensional fire simulation are presented that demonstrate the potential for considerable savings in simulation run-times when compared with control sets from various sources. Furthermore, the results demonstrate the potential for enhanced solution reliability due to obtaining acceptable convergence within each time step, unlike some of the comparison simulations.
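The production rules themselves are not given in the paper; the following is a minimal sketch of the general idea of residual-driven relaxation control, with all thresholds, factors and names invented for illustration.

```python
# Hypothetical sketch of rule-based dynamic relaxation control, in the spirit
# of the approach described above. The SMARTFIRE rules and thresholds are not
# reproduced here; everything below is illustrative only.

def adjust_relaxation(relax, residual_history, grow=1.05, shrink=0.5,
                      r_min=0.1, r_max=1.0):
    """Return an updated relaxation factor based on recent residual behaviour.

    Rule 1: if residuals fall monotonically, cautiously increase relaxation.
    Rule 2: if residuals rise (divergence risk), cut relaxation sharply.
    Rule 3: otherwise leave it unchanged.
    """
    if len(residual_history) < 3:
        return relax
    r1, r2, r3 = residual_history[-3:]
    if r3 < r2 < r1:                      # steadily converging
        relax = min(relax * grow, r_max)
    elif r3 > r2:                         # residual growing
        relax = max(relax * shrink, r_min)
    return relax

# A solver loop would record the residual after each sweep and let the
# controller retune the relaxation factor before the next sweep.
relax, history = 0.7, []
for residual in [1.0, 0.6, 0.4, 0.55, 0.3]:   # made-up residuals
    history.append(residual)
    relax = adjust_relaxation(relax, history)
    print(f"residual={residual:.2f} -> relaxation={relax:.3f}")
```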

Relevance:

90.00%

Publisher:

Abstract:

We present a review of the historical evolution of software engineering, intertwined with the history of knowledge engineering, because "those who cannot remember the past are condemned to repeat it." This retrospective is a further step towards understanding the current state of both engineering disciplines; history also contains positive experiences, some of which we would like to remember and repeat. The two disciplines had parallel and divergent evolutions, but followed a similar pattern. We also define a set of milestones that represent convergences or divergences of the software development methodologies. These milestones do not appear at the same time in software engineering and knowledge engineering, so lessons learned in one discipline can help the evolution of the other.

Relevance:

90.00%

Publisher:

Abstract:

The present paper introduces a technology-enhanced teaching method that promotes deep learning. Four stages that correspond to four different student cohorts were used for its development and to analyse its effectiveness. The effectiveness of the method has been assessed in terms of examination results as well as results obtained from class response system software statistics. The evidence gathered indicates that the method developed is very effective and its implementation is straightforward. Furthermore, its success in achieving results seems to be independent of the skills and/or experience of the lecturer.

Relevance:

90.00%

Publisher:

Abstract:

The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for the correct functioning of software applications, and comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other, and it is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI's input space from the specification of the GUI, its visible behaviors and static analysis of its program-code. GUIs are typically dynamic in nature; their user-visible state is guided by the underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling interactions between the visible GUI of a GUI-based software system and its underlying program-code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI test cases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled and deployed for constructing long GUI test cases. These test cases are able to drive the GUI into states that were not possible using existing models. Implementation and evaluation have been conducted using GUITAR, a fully-automated, open-source GUI testing framework.
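The thesis's actual model is not reproduced in the abstract; the sketch below illustrates, with hypothetical event handlers and shared-state sets, the underlying idea of chaining events whose handlers interact through shared program state to obtain long test cases.

```python
# Illustrative sketch (not the thesis's model): if one event's handler writes
# program state that another event's handler reads, chaining them is more
# likely to reach defect-prone GUI states. All names below are hypothetical.

from itertools import permutations

# For each event, the variables its handler writes and reads, as static
# analysis of the handlers might report them.
writes = {"open_file": {"doc"}, "edit": {"doc", "dirty"}, "save": {"dirty"}}
reads  = {"open_file": set(),  "edit": {"doc"},          "save": {"doc", "dirty"}}

def interacts(e1, e2):
    """e1 -> e2 is interesting if e1's handler writes state e2's handler reads."""
    return bool(writes[e1] & reads[e2])

def long_testcases(events, length):
    """Enumerate event sequences in which every adjacent pair interacts."""
    for seq in permutations(events, length):
        if all(interacts(a, b) for a, b in zip(seq, seq[1:])):
            yield seq

for tc in long_testcases(["open_file", "edit", "save"], 3):
    print(" -> ".join(tc))   # e.g. open_file -> edit -> save
```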

Relevance:

90.00%

Publisher:

Abstract:

This article is the result of the research project "Design of a model to improve cost estimation processes for software development companies." A review of the international literature is presented in order to identify trends and methods for producing more accurate software cost estimates. Using the Delphi predictive method, a panel of experts from the Barranquilla software sector classified and rated five realistic estimation scenarios according to their probability of occurrence. A completely randomized experiment was designed, whose results pointed to two scenarios that were qualitatively similar in statistical terms; from these, an analysis model was built around three agents: methodology, team capability, and technology products, each with three compliance categories for achieving more precise estimates.
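The expert panel's data are not reproduced in the article; as a minimal sketch of the Delphi-style aggregation it describes (all numbers invented), ratings from several experts can be pooled per scenario and the spread used to decide whether another consensus round is needed.

```python
# Minimal Delphi-style aggregation sketch with invented numbers: experts rate
# each estimation scenario's probability of occurrence; the median is the
# pooled rating and the spread flags where consensus is still missing.

from statistics import median, pstdev

# Hypothetical ratings (0-1) from five experts for three scenarios; the real
# study used experts from Barranquilla software firms.
ratings = {
    "scenario_1": [0.8, 0.7, 0.9, 0.8, 0.7],
    "scenario_2": [0.4, 0.9, 0.2, 0.8, 0.3],   # high spread: no consensus yet
    "scenario_3": [0.6, 0.6, 0.5, 0.7, 0.6],
}

for name, rs in ratings.items():
    agg, spread = median(rs), pstdev(rs)
    verdict = "consensus" if spread < 0.15 else "re-survey in next round"
    print(f"{name}: median={agg:.2f} spread={spread:.2f} -> {verdict}")
```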

Relevance:

90.00%

Publisher:

Abstract:

Sustainability in software systems is still a new practice that most software developers and companies are trying to incorporate into their software development lifecycle, and it has been widely discussed in academia. Sustainability is a complex concept viewed from economic, environmental and social dimensions, with several proposed definitions that sometimes make the concept fuzzy and difficult to apply and assess in software systems. This has hindered the adoption of sustainability in the software industry. Little research explores sustainability as a quality property of software products and services, answering questions such as: how can sustainability be quantified as a quality construct in the same way as other quality attributes such as security, usability and reliability? How can it be applied to software systems? What are the measures and the measurement scale of sustainability? The goal of this research is to investigate the definitions, perceptions and measurement of sustainability from the quality perspective. Grounded in the general theory of software measurement, the aim is to develop a method that decomposes sustainability into factors, criteria and metrics. The result is a method to quantify and assess the sustainability of software systems while incorporating management and user concerns. Conclusion: the method will make it easier for companies to adopt sustainability while facilitating its integration into the software development process and tools. It will also help companies measure the sustainability of their software products along economic, environmental, social, individual and technological dimensions.
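The actual factors, criteria, weights and scales are the research result and are not given in the abstract; the sketch below, with entirely hypothetical values, only illustrates how a factor-criteria-metric decomposition can roll individual metrics up into a sustainability index.

```python
# Hypothetical factor/criteria/metric decomposition in the style the abstract
# describes. Every factor, weight and metric value below is invented.

# factor -> criterion -> (weight, normalized metric value in [0, 1])
model = {
    "environmental": {"energy_per_request": (0.6, 0.7), "idle_power": (0.4, 0.5)},
    "economic":      {"maintenance_cost":   (1.0, 0.8)},
    "technical":     {"modifiability":      (0.5, 0.9), "reusability": (0.5, 0.6)},
}

def factor_score(criteria):
    """Weighted average of normalized metric values for one factor."""
    total_w = sum(w for w, _ in criteria.values())
    return sum(w * v for w, v in criteria.values()) / total_w

scores = {f: factor_score(c) for f, c in model.items()}
overall = sum(scores.values()) / len(scores)   # equal factor weighting assumed
for f, s in scores.items():
    print(f"{f}: {s:.2f}")
print(f"overall sustainability index: {overall:.2f}")
```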

Relevance:

90.00%

Publisher:

Abstract:

This thesis reports on an investigation of the feasibility and usefulness of incorporating dynamic management facilities for managing sensed context data in a distributed context-aware mobile application. The investigation focuses on reducing the work required to integrate new sensed context streams into an existing context-aware architecture. Current architectures require integration work for each new stream and each new context that is encountered. This mode of operation is acceptable for current fixed architectures; however, as systems become more mobile, the number of discoverable streams increases. Without the ability to discover and use these new streams, the functionality of any given device will be limited to the streams that it knows how to decode. The integration of new streams requires that the sensed context data be understood by the current application. If a new source provides data of a type that an application currently requires, then the new source should be connected to the application without any prior knowledge of the new source. If the type is similar and can be converted, then this stream too should be appropriated by the application.

Such applications are based on portable devices (phones, PDAs) for semi-autonomous services that use data from sensors connected to the devices, plus data exchanged with other such devices and remote servers. They must handle input from a variety of sensors, refining the data locally and managing its communication from the device in volatile and unpredictable network conditions. The choice to focus on locally connected sensory input allows for the introduction of privacy and access controls; this local control can determine how the information is communicated to others.

This investigation focuses on the evaluation of three approaches to sensor data management. The first system is characterised by static management based on pre-pended metadata. This was the reference system: developed for a mobile system, the data was processed based on the attached metadata, and the code that performed the processing was static. The second system was developed to move away from static processing and introduce greater freedom in handling the data stream, which resulted in a heavyweight approach focused on pushing the processing of the data into a number of networked nodes rather than the monolithic design of the previous system. By creating a separate communication channel for the metadata, it is possible to be more flexible with the amount and type of data transmitted. The final system pulled the benefits of the other systems together: by providing a small management class that loads a separate handler based on the incoming data, dynamism was maximised whilst maintaining ease of code understanding.

The three systems were then compared to highlight their ability to dynamically manage new sensed context. The evaluation took two approaches: the first is a quantitative analysis of the code to understand the relative complexity of the three systems, performed by evaluating what changes each system required to support a new context; the second takes a qualitative view of the work required by the software engineer to reconfigure the systems to support a new data stream. The evaluation highlights the scenarios to which each of the three systems is best suited. There is always a trade-off in the development of a system, and the three approaches highlight this fact. A statically bound system can be quick to develop but may need to be completely re-written if the requirements move too far; alternatively, a highly dynamic system may be able to cope with new requirements, but the developer time to create such a system may be greater than the creation of several simpler systems.
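The management class itself is not included in the abstract; the following is a minimal Python sketch, with hypothetical names, of the third approach described above: a small manager that selects a handler for each incoming stream by data type, falling back to a registered conversion when only a similar type is understood.

```python
# Minimal sketch, assuming hypothetical stream types and handlers, of a
# management class that dispatches incoming sensed-context records to
# dynamically registered handlers, converting similar types where possible.

class StreamManager:
    def __init__(self):
        self._handlers = {}       # data type -> handler callable
        self._converters = {}     # (from_type, to_type) -> conversion callable

    def register(self, data_type, handler):
        self._handlers[data_type] = handler

    def register_converter(self, from_type, to_type, fn):
        self._converters[(from_type, to_type)] = fn

    def dispatch(self, data_type, payload):
        # Direct handler for this type?
        if data_type in self._handlers:
            return self._handlers[data_type](payload)
        # Otherwise, try converting to a type we do understand.
        for (src, dst), convert in self._converters.items():
            if src == data_type and dst in self._handlers:
                return self._handlers[dst](convert(payload))
        raise ValueError(f"no handler or conversion for stream type {data_type!r}")

# Usage: a newly discovered Fahrenheit stream is appropriated via conversion.
mgr = StreamManager()
mgr.register("temp_c", lambda v: print(f"temperature: {v:.1f} °C"))
mgr.register_converter("temp_f", "temp_c", lambda f: (f - 32) * 5 / 9)
mgr.dispatch("temp_c", 21.5)
mgr.dispatch("temp_f", 70.0)   # converted, then handled by the temp_c handler
```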

Relevance:

90.00%

Publisher:

Abstract:

Due to the growth of design size and complexity, design verification is an important aspect of the logic circuit development process. The purpose of verification is to validate that the design meets the system requirements and specification, which is done by either functional or formal verification. The most popular approach to functional verification is the use of simulation-based techniques: using models to replicate the behaviour of an actual system is called simulation. In this thesis, a software/data-structure architecture without explicit locks is proposed to accelerate logic gate circuit simulation. We call this system ZSIM. The ZSIM software architecture simulator targets low-cost SIMD multi-core machines. Its performance is evaluated on the Intel Xeon Phi and two other machines (Intel Xeon and AMD Opteron). The aim of these experiments is to:

• verify that the data structure used allows SIMD acceleration, particularly on machines with gather instructions (section 5.3.1);
• verify that, on sufficiently large circuits, substantial gains can be made from multicore parallelism (section 5.3.2);
• show that a simulator using this approach out-performs an existing commercial simulator on a standard workstation (section 5.3.3);
• show that the performance on a cheap Xeon Phi card is competitive with results reported elsewhere on much more expensive supercomputers (section 5.3.5).

To evaluate ZSIM, two types of test circuits were used: 1. circuits from the IWLS benchmark suite [1], which allow direct comparison with other published studies of parallel simulators; 2. circuits generated by a parametrised circuit synthesizer. The synthesizer used an algorithm that has been shown to generate circuits that are statistically representative of real logic circuits, and it allowed testing of a range of very large circuits, larger than the ones for which it was possible to obtain open source files.

The experimental results show that with SIMD acceleration and multicore, ZSIM gained a peak parallelisation factor of 300 on the Intel Xeon Phi and 11 on the Intel Xeon. With only SIMD enabled, ZSIM achieved a maximum parallelisation gain of 10 on the Intel Xeon Phi and 4 on the Intel Xeon. Furthermore, it was shown that this software architecture simulator running on a SIMD machine is much faster than, and can handle much bigger circuits than, a widely used commercial simulator (Xilinx) running on a workstation. The performance achieved by ZSIM was also compared with similar pre-existing work on logic simulation targeting GPUs and supercomputers. It was shown that the ZSIM simulator running on a Xeon Phi machine gives simulation performance comparable to the IBM Blue Gene supercomputer at very much lower cost. The experimental results have also shown that the Xeon Phi is competitive with simulation on GPUs and allows the handling of much larger circuits than have been reported for GPU simulation. When targeting the Xeon Phi architecture, the automatic cache management of the Xeon Phi handles the on-chip local store without any explicit mention of the local store in the architecture of the simulator itself; when targeting GPUs, by contrast, explicit cache management in the program increases the complexity of the software architecture. Furthermore, one of the strongest points of the ZSIM simulator is its portability: the same code was tested on both AMD and Xeon Phi machines, and the same architecture that performs efficiently on the Xeon Phi was ported to a 64-core NUMA AMD Opteron.

To conclude, the two main achievements are restated as follows: the primary achievement of this work was proving that the ZSIM architecture was faster than previously published logic simulators on low-cost platforms; the secondary achievement was the development of a synthetic testing suite that went beyond the scale range that was previously publicly available, based on prior work showing that the synthesis technique is valid.
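ZSIM's lock-free data structure is not reproduced in the abstract; the sketch below, with an invented toy netlist, only illustrates the general pattern such simulators exploit: gates grouped by topological level are mutually independent, so each level can be evaluated as one vectorized, lock-free step (numpy standing in for SIMD gather/compute, and bit-packed words giving many test vectors per operation).

```python
# Sketch only (not ZSIM's actual architecture): levelized, data-parallel,
# bit-parallel logic simulation. Each uint64 signal word carries 64 test
# vectors; all gates in one level are evaluated in a single vectorized step.

import numpy as np

signals = np.zeros(7, dtype=np.uint64)       # flat signal array
signals[0:3] = [0b1100, 0b1010, 0b1111]      # primary inputs (4 patterns shown)

# Netlist by level: (op, input_a_indices, input_b_indices, output_indices).
levels = [
    ("and", np.array([0]), np.array([1]), np.array([3])),   # s3 = s0 & s1
    ("or",  np.array([3]), np.array([2]), np.array([4])),   # s4 = s3 | s2
    ("xor", np.array([4]), np.array([0]), np.array([5])),   # s5 = s4 ^ s0
]

ops = {"and": np.bitwise_and, "or": np.bitwise_or, "xor": np.bitwise_xor}

for op, a_idx, b_idx, out_idx in levels:
    # Gather operands, evaluate every gate of the level at once, scatter back.
    signals[out_idx] = ops[op](signals[a_idx], signals[b_idx])

print(f"s5 = {int(signals[5]):04b}")   # -> 0011 for the four patterns above
```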

Relevance:

90.00%

Publisher:

Abstract:

Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to many organizations from many fields. The Brazilian software industry has some peculiarities that distinguish it from industries located in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the process that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements, through an exploratory, descriptive and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; a questionnaire was administered, and an interview was conducted with one of each firm's key professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that was very useful to the analysis carried out through interpretation of the interviews. As a result, it emerged that companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, a balanced distribution was found between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant; Scrum is the most used of the methodologies based on the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in a way that generates a different isomorphic pressure. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' schooling and background, because they relate closely to the software firms. In this relationship, a dual, bilateral influence was found. Finally, the structuring level of the organizational field was also identified as low, which gives organizational actors a chance to act independently.

Relevance:

90.00%

Publisher:

Abstract:

Following the drop in estrogen at menopause, some women begin to lose bone mass at more than 1% per year, reaching the end of five years with a loss greater than 25%. In this regard, factors such as older age, low calcium intake and premature menopause favor the onset of osteoporosis. Preventive measures, such as nutritional counseling towards a proper diet and the support of technology through applications that assess dietary intake, are essential. Thus, this study aimed to develop an application for the Android® platform focused on evaluating the nutritional and organic conditions involved in bone health and the risk of developing osteoporosis in postmenopausal women. To achieve this goal, we studied 72 women aged 46-79 years from the bone-health physical exercise program of the Laboratory for Research in Biochemistry and Densitometry at the Federal Technological University of Paraná. Data were collected in the second half of 2014 through bone densitometry and body composition tests, blood tests, anthropometric measurements and nutrition assessment. The study included postmenopausal women over 45 years of age with a current diagnosis of primary osteopenia or osteoporosis. Bone mineral density and body composition were assessed with a Hologic Discovery™ Model A dual-energy X-ray absorptiometry (DXA) device. The anthropometric assessment included body mass, height, and abdominal, waist and hip circumferences. Food consumption was assessed with the 24-hour dietary recall (24HR), and energy and nutrient intake was estimated by tabulating the foods eaten in the Diet Pro 4® software. In a subsample of 30 women with osteopenia/osteoporosis, serum calcium and alkaline phosphatase tests were performed. The results for this group (n = 30) showed an average calcium intake of 570 mg/day (±340). The analysis showed mean serum calcium within the normal range (10.20 mg/dl ±0.32) and slightly elevated mean alkaline phosphatase values (105.40 U/L ±23.70). Furthermore, there was a significant correlation between protein consumption and the optimal daily intake of calcium (0.375, p-value 0.05). Based on these findings, we developed an early-stage application, called OsteoNutri, for Google®'s Android® platform. We chose Java in Eclipse®, where the Android® version of the project was built, the application icons were chosen and the visual editor was configured for building the application layouts; DroidDraw® was used to develop the three application GUIs. For practical tests we used a phone compatible with the version that was targeted (4.4 or higher). The prototype was developed in conjunction with the Applications Development and Instrumentation Group (GDAI) of the Federal Technological University of Paraná. This application can therefore be considered an important tool for dietary control, allowing closer monitoring of calcium and dietary protein consumption.
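OsteoNutri's code is not published in the abstract; the sketch below only illustrates the kind of dietary check such an app could perform, comparing tabulated daily calcium against a reference value. The per-portion food values are invented, and the 1200 mg/day reference is the commonly cited calcium recommendation for women over 50, used here only as an example threshold.

```python
# Illustrative dietary-check sketch (not OsteoNutri's actual code).
CALCIUM_MG = {"milk_200ml": 240, "yogurt_170g": 180, "cheese_30g": 200,
              "broccoli_100g": 47}   # hypothetical per-portion values

def daily_calcium(foods, reference_mg=1200):
    """Sum the day's calcium and report the shortfall against the reference."""
    total = sum(CALCIUM_MG[f] for f in foods)
    gap = reference_mg - total
    status = "adequate" if gap <= 0 else f"short by {gap} mg"
    return total, status

intake, status = daily_calcium(["milk_200ml", "yogurt_170g", "broccoli_100g"])
print(f"calcium today: {intake} mg ({status})")
# -> calcium today: 467 mg (short by 733 mg), close to the 570 mg/day
#    average intake reported for the subsample above.
```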

Relevance:

90.00%

Publisher:

Abstract:

This document presents GEmSysC, a unified cryptographic API for embedded systems. Software layers implementing this API can be built over existing libraries, allowing embedded software to access cryptographic functions in a consistent way that does not depend on the underlying library. The API complies with good practices for API design and for embedded software development, and took its inspiration from other cryptographic libraries and standards. The main inspiration for creating GEmSysC was the CMSIS-RTOS standard, which defines a unified API for embedded software in an implementation-independent way, but targets operating systems instead of cryptographic functions. GEmSysC is made of a generic core and attachable modules, one for each cryptographic algorithm. This document contains the specification of the core of GEmSysC and three of its modules: AES, RSA and SHA-256. GEmSysC was built targeting embedded systems, but this does not restrict its use to such systems – after all, embedded systems are just very limited computing devices. As a proof of concept, two implementations of GEmSysC were made. One was built over wolfSSL, an open-source library for embedded systems; the other was built over OpenSSL, which is open source and a de facto standard but, unlike wolfSSL, does not specifically target embedded systems. The implementation built over wolfSSL was evaluated on a Cortex-M3 processor with no operating system, while the implementation built over OpenSSL was evaluated on a personal computer running the Windows 10 operating system. This document presents test results showing GEmSysC to be simpler than other libraries in some aspects. The results show that both implementations incur little overhead in computation time compared to the underlying cryptographic libraries themselves: the overhead, measured for each cryptographic algorithm, is between around 0% and 0.17% for the implementation over wolfSSL and between 0.03% and 1.40% for the one over OpenSSL. This document also presents the memory costs of each implementation.
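The GEmSysC specification itself is in C for embedded targets; as a language-neutral illustration of the "generic core plus attachable modules" design described above, the Python sketch below routes calls through a unified entry point to a swappable backend module. All names are hypothetical, and hashlib stands in for wolfSSL or OpenSSL.

```python
# Sketch of the core-plus-attachable-modules idea (hypothetical names; not the
# GEmSysC specification). The core exposes one algorithm-agnostic entry point;
# each module adapts one backend library to that interface.

import hashlib

class GemCore:
    """Generic core: routes calls to whichever module is attached per algorithm."""
    def __init__(self):
        self._modules = {}

    def attach(self, algorithm, module):
        self._modules[algorithm] = module

    def digest(self, algorithm, data: bytes) -> bytes:
        return self._modules[algorithm].digest(data)

class Sha256Module:
    """SHA-256 module backed by hashlib, standing in for wolfSSL or OpenSSL."""
    def digest(self, data: bytes) -> bytes:
        return hashlib.sha256(data).digest()

core = GemCore()
core.attach("sha256", Sha256Module())
# Application code depends only on the unified API; swapping the backend
# module does not change this call site.
print(core.digest("sha256", b"hello").hex())
```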

Relevance:

90.00%

Publisher:

Abstract:

Over the past several decades, thousands of otoliths, bivalve shells, and scales have been collected for the purposes of age determination and remain archived in European and North American fisheries laboratories. Advances in digital imaging and computer software, combined with techniques developed by tree-ring scientists, provide a means by which to extract additional levels of information from these calcified structures and generate annually resolved (one value per year), multidecadal time-series of population-level growth anomalies. Chemical and isotopic properties may also be extracted to provide additional information regarding the environmental conditions these organisms experienced. Given that they are exactly placed in time, chronologies can be directly compared to instrumental climate records, chronologies from other regions or species, or time-series of other biological phenomena. In this way, chronologies may be used to reconstruct historical ranges of environmental variability, identify climatic drivers of growth, establish linkages within and among species, and generate ecosystem-level indicators.

Following the first workshop in Hamburg, Germany, in December 2014, the second workshop on Growth increment Chronologies in Marine Fish: climate-ecosystem interactions in the North Atlantic (WKGIC2) met at the Mediterranean Institute for Advanced Studies headquarters in Esporles, Spain, on 18–22 April 2016, chaired by Bryan Black (USA) and Christoph Stransky (Germany). Thirty-six participants from fifteen different countries attended. Objectives were to i) review the applications of chronologies developed from growth-increment widths in the hard parts (otoliths, shells, scales) of marine fish and bivalve species, ii) review the fundamentals of crossdating and chronology development, iii) discuss assumptions and limitations of these approaches, iv) measure otolith growth-increment widths in image analysis software, v) learn software to statistically check increment dating accuracy, vi) generate a growth-increment chronology and relate it to climate indices, and vii) initiate cooperative projects or training exercises to commence after the workshop.

The workshop began with an overview of tree-ring techniques of chronology development, including a hands-on exercise in crossdating. Next, we discussed the applications of fish and bivalve biochronologies and the range of issues that could be addressed. We then reviewed key assumptions and limitations, especially those associated with short-lived species for which there are numerous and extensive otolith archives in European fisheries labs. Next, participants were provided with images of European plaice otoliths from the North Sea and taught to measure increment widths in image analysis software. Upon completion of measurements, techniques of chronology development were discussed and contrasted with those that have been applied to long-lived species. Plaice growth time-series were then related to environmental variability using the KNMI Climate Explorer. Finally, potential future collaborations and funding opportunities were discussed, and there was a clear desire to meet again to compare various statistical techniques for chronology development using a range of existing fish, bivalve, and tree growth-increment datasets.

Overall, we hope to increase the use of these techniques and, over the long term, develop networks of biochronologies for integrative analyses of ecosystem functioning and relationships to long-term climate variability and fishing pressure.
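As an illustration of the chronology-development steps listed above (with invented data, not WKGIC2's plaice measurements): detrend each individual's growth-increment series to remove the age-related decline, average the standardized indices into a master chronology, and correlate that chronology against a climate index such as one obtained from the KNMI Climate Explorer.

```python
# Minimal chronology-development sketch with invented data.
import numpy as np

years = np.arange(2000, 2010)
# Hypothetical increment widths (mm) for three fish; widths decline with age.
fish = np.array([
    [0.9, 0.8, 0.85, 0.7, 0.72, 0.6, 0.62, 0.5, 0.52, 0.45],
    [1.0, 0.95, 0.9, 0.8, 0.85, 0.7, 0.72, 0.6, 0.65, 0.55],
    [0.8, 0.75, 0.7, 0.65, 0.68, 0.55, 0.6, 0.48, 0.5, 0.42],
])

def detrend(series, years):
    """Remove the age-related linear trend; return dimensionless growth indices."""
    trend = np.polyval(np.polyfit(years, series, 1), years)
    return series / trend

indices = np.array([detrend(s, years) for s in fish])
chronology = indices.mean(axis=0)            # master chronology, one value/year

# Hypothetical climate index (e.g. sea-surface temperature) for the same years.
sst = np.array([11.2, 11.0, 11.4, 10.8, 11.1, 10.6, 10.9, 10.4, 10.7, 10.3])
r = np.corrcoef(chronology, sst)[0, 1]
print(f"chronology-climate correlation: r = {r:.2f}")
```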