975 results for Developing Software


Relevance:

100.00%

Publisher:

Abstract:

This report demonstrates:
• the development of software agents for data mining
• the linking of data mining to the building model in virtual environments
• the linking of knowledge development to the building model in virtual environments
• a demonstration of the software agents for data mining
• the population of the building model with maintenance data

Relevance:

70.00%

Publisher:

Abstract:

Free and open source software development is an alternative to traditional software engineering as an approach to the development of complex software systems. It is a way of developing software based on geographically distributed teams of volunteers, without an apparent central plan or traditional coordination mechanisms. The purpose of this thesis is to summarize the current knowledge about free and open source software development and to explore the ways in which further understanding of it could be gained. The results of research in the field, as well as the research methods, are introduced and discussed. The adaptation of software process metrics to the context of free and open source software development is also illustrated, and the possibility of using them as tools to validate other research is discussed.
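
As an illustration of the kind of process metric such research can build on, the following is a minimal sketch, assuming a local git checkout, that counts commits per contributor; the metric and the code are illustrative, not taken from the thesis.

import subprocess
from collections import Counter

def commits_per_author(repo_path="."):
    # "%ae" prints each commit's author e-mail, one line per commit.
    log = subprocess.run(
        ["git", "-C", repo_path, "log", "--pretty=format:%ae"],
        capture_output=True, text=True, check=True,
    ).stdout
    return Counter(line.strip() for line in log.splitlines() if line.strip())

if __name__ == "__main__":
    # Top ten contributors by commit count in the current repository.
    for author, n in commits_per_author().most_common(10):
        print(f"{n:6d}  {author}")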

Relevance:

70.00%

Publisher:

Abstract:

When developing software for autonomous mobile robots, one inevitably has to tackle some kind of perception. Moreover, when dealing with agents that possess some level of reasoning for executing their actions, the environment and the robot's internal state need to be modeled in a way that represents the scenario in which the robot operates. Carried out within the ATRI group, part of the IEETA research unit at Aveiro University, this work uses two of the group's projects as test beds, particularly in the scenario of robotic soccer with real robots. With the main objective of developing sensor and information fusion algorithms that could be used effectively by these teams, several state-of-the-art approaches were studied, implemented and adapted to each of the robot types.

Within the MSL RoboCup team CAMBADA, the main focus was the perception of the ball and obstacles, with the creation of models capable of providing extended information so that the robot's reasoning can be ever more effective. To achieve this, several methodologies were analyzed, implemented, compared and improved. Concerning the ball, filtering methodologies for stabilizing its position and estimating its velocity were analyzed. With the goalkeeper in mind, work was also done to provide it with information about aerial balls. As for obstacles, a new definition of the way they are perceived by the vision system, and of the type of information provided, was created, as well as a methodology for identifying which of the obstacles are teammates. A tracking algorithm was also developed, which ultimately assigns each obstacle a unique identifier. Associated with the improved obstacle perception, a new reactive obstacle avoidance algorithm was created.

In the context of the SPL RoboCup team Portuguese Team, besides the inevitable adaptation of many of the sensor and information fusion algorithms already developed, and considering that the team was recently created, the objective was to build a sustainable software architecture that could serve as the base for future modular development. The architecture created is based on a series of different processes and the means of communication among them. All processes were created or adapted for the new architecture, and a base set of roles and behaviors was defined during this work to achieve a functional base framework. In terms of perception, the main focus was to define a projection model and camera pose extraction that could provide information in metric coordinates. The second main objective was to adapt the CAMBADA localization algorithm to work on the NAO robots, considering all the limitations they present compared to the MSL robots, especially in terms of computational resources.

A set of support tools was developed or improved to support testing and development in both teams. In general, the work developed during this thesis improved the performance of the teams during play, as well as the effectiveness of the development team during development and test phases.
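
As an illustration of the ball filtering discussed above, the following is a minimal sketch of one standard approach to stabilizing a position estimate and deriving a velocity: a linear Kalman filter with a constant-velocity model. The frame interval and noise parameters are illustrative assumptions, not the CAMBADA team's actual values.

import numpy as np

class BallFilter:
    """Constant-velocity Kalman filter over noisy 2D ball detections."""

    def __init__(self, dt=1 / 30, q=5.0, r=0.05):
        self.x = np.zeros(4)                  # state: [px, py, vx, vy]
        self.P = np.eye(4)                    # state covariance
        self.F = np.eye(4)                    # transition: p += v * dt
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))             # we only measure position
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * q * dt           # process noise (tuning knob)
        self.R = np.eye(2) * r                # measurement noise (tuning knob)

    def step(self, z):
        # Predict, then correct with the measured position z = [px, py].
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        y = np.asarray(z, float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2], self.x[2:]         # filtered position, velocity

f = BallFilter()
for t in range(5):
    pos, vel = f.step([0.3 * t, 0.0])         # synthetic detections along x
print(pos, vel)

Feeding one vision detection per frame yields a smoothed position, and the velocity estimate falls out of the same state vector at no extra cost.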

Relevance:

70.00%

Publisher:

Abstract:

Requirements engineering is not straightforward for any software development team. Developing software when team members are located in widely distributed geographic locations poses many challenges for developers, particularly during the requirements engineering phase. This paper reports on a case study of a large software development project, completed in just seven months, between users located in the UK and software developers from an international software house based in New Zealand. The case indicates that while "true" global requirements engineering may be desirable for achieving economy of resources, a "hybrid" structure of requirements engineering processes is more realistic, so that lasting relationships with clients may be formed and requirements engineering activities accomplished. The main impediment to the requirements engineering process during global software development, as recounted by the team members in this case, is communication. Communication issues can be further described in terms of four categories: distribution of the clients and the development team; distribution of the development team; cultural differences between the clients and the development team; and cultural differences within the development team.

Relevance:

70.00%

Publisher:

Abstract:

Software Product Lines (SPL) is a software engineering approach to developing families of software systems that share common features and differ in other features according to the requested software systems. Adopting the SPL approach can bring several benefits, such as cost reduction, product quality, productivity, and shorter time to market. On the other hand, the SPL approach brings new challenges to software evolution that must be considered. Recent research has explored and proposed automated approaches based on code analysis and traceability techniques for change impact analysis in the context of SPL development. These approaches have limitations, such as the customization of the analysis functionality to address different change impact analysis strategies, and the change impact analysis of fine-grained variability. This dissertation proposes a change impact analysis tool for SPL development, called Squid Impact Analyzer. The tool allows change impact analysis based on information from variability modeling, the mapping of variability to code assets, and the existing dependency relationships between code assets. The tool is assessed through an experiment that compares its change impact analysis results with real changes applied to several evolution releases of an SPL for media management on mobile devices.
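
The following is a minimal sketch of dependency-based change impact analysis in the spirit described above; the data model and function names are hypothetical illustrations, not Squid Impact Analyzer's actual API.

from collections import defaultdict, deque

def impacted_assets(changed, depends_on):
    # Reverse the edges: if A depends on B, a change to B impacts A.
    dependents = defaultdict(set)
    for asset, deps in depends_on.items():
        for dep in deps:
            dependents[dep].add(asset)
    # Transitive closure over the reversed dependency graph.
    seen, queue = set(changed), deque(changed)
    while queue:
        for nxt in dependents[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def impacted_features(assets, feature_map):
    # A feature is impacted if any asset mapped to it is impacted.
    return {f for f, owned in feature_map.items() if owned & assets}

depends_on = {"player.c": {"codec.c"}, "ui.c": {"player.c"}}
feature_map = {"Playback": {"player.c", "codec.c"}, "Skins": {"ui.c"}}
hit = impacted_assets({"codec.c"}, depends_on)
print(hit, impacted_features(hit, feature_map))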

Relevance:

70.00%

Publisher:

Abstract:

Software testing is a key aspect of software reliability and quality assurance, in a context where software development constantly has to overcome mammoth challenges in a continuously changing environment. One of the characteristics of software testing is that it has a large intellectual capital component and can thus benefit from the experience gained in past projects. Software testing can therefore potentially benefit from solutions provided by the knowledge management discipline. There are in fact a number of proposals for effective knowledge management in several software engineering processes. Objective: We defend the use of a lessons learned system for software testing. The reason is that such a system is an effective knowledge management resource that enables testers and managers to take advantage of the experience locked away in the brains of the testers. To do this, the experience has to be gathered, disseminated and reused. Method: After analyzing the proposals for managing software testing experience, significant weaknesses were detected in current systems of this type. The architectural model proposed here for lessons learned systems is designed to avoid these weaknesses. This model (i) defines the structure of the software testing lessons learned; (ii) sets up procedures for managing the lessons learned; and (iii) supports the design of software tools to manage the lessons learned. Results: The result is a different approach, based on managing the lessons learned that software testing engineers gather from everyday experience, with two basic goals: usefulness and applicability. Conclusion: The architectural model proposed here lays the groundwork for overcoming the obstacles to sharing and reusing the experience gained in software testing and test management. As such, it provides guidance for developing software testing lessons learned systems.
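
As an illustration of what a structured software testing lesson learned and its retrieval could look like, the following is a minimal sketch; the record fields are illustrative assumptions, not the paper's actual model.

from dataclasses import dataclass, field

@dataclass
class Lesson:
    title: str
    context: str          # project/test phase in which it was learned
    problem: str          # what went wrong (or right)
    recommendation: str   # what to do next time
    tags: set[str] = field(default_factory=set)

def search(lessons, keyword):
    # Naive keyword retrieval over titles and tags, for reuse of experience.
    k = keyword.lower()
    return [l for l in lessons
            if k in l.title.lower() or k in {t.lower() for t in l.tags}]

repo = [Lesson("Flaky UI tests", "regression suite",
               "timing-dependent waits caused intermittent failures",
               "replace sleeps with explicit waits", {"ui", "flaky"})]
print([l.title for l in search(repo, "flaky")])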

Relevance:

70.00%

Publisher:

Abstract:

The process of developing software is a complex undertaking involving multiple stakeholders. While the intentions of these parties might vary to some extent, the ultimate goal can be seen as a satisfactory product. Lean and agile software development practices strive toward this, and they place customer contentment among the highest aims of the process. An important aspect of any development process is the act of innovation. Without it, nothing progresses and the whole process is unnecessary. As a target-domain expert, the customer is an important part of effective innovation. Problems arise, however, when the customer does not take an active part in the activities; a lack of familiarity with software development can easily cause such issues. Unfortunately, little research has been conducted on product innovation, which makes it difficult to formulate a recommended approach to stimulating the customer and encouraging more active participation. Ultimately, a small set of high-level guidelines for inducing innovation was identified in the available literature. To conclude, this thesis presents the findings made during the development of a small web application and compares them to the aforementioned literature findings. While the guidelines seem to provide promising results, further empirical research is needed to reach more significant conclusions.

Relevance:

60.00%

Publisher:

Abstract:

The Australian e-Health Research Centre in collaboration with the Queensland University of Technology's Paediatric Spine Research Group is developing software for visualisation and manipulation of large three-dimensional (3D) medical image data sets. The software allows the extraction of anatomical data from individual patients for use in preoperative planning. State-of-the-art computer technology makes it possible to slice through the image dataset at any angle, or manipulate 3D representations of the data instantly. Although the software was initially developed to support planning for scoliosis surgery, it can be applied to any dataset whether obtained from computed tomography, magnetic resonance imaging or any other imaging modality.
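
As an illustration of slicing a 3D image dataset at an arbitrary angle, the following is a minimal sketch that resamples a volume along a rotated plane using trilinear interpolation; it shows the generic technique, not the group's actual implementation.

import numpy as np
from scipy.ndimage import map_coordinates

def oblique_slice(volume, center, u, v, size=64, spacing=1.0):
    """Sample a size x size slice through `center`, spanned by unit vectors u, v."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    s = (np.arange(size) - size / 2) * spacing
    gu, gv = np.meshgrid(s, s, indexing="ij")
    # Voxel coordinates of every pixel on the plane, shape (3, size, size).
    pts = (np.asarray(center, float)[:, None, None]
           + u[:, None, None] * gu + v[:, None, None] * gv)
    # order=1 -> trilinear interpolation; clamp at the volume boundary.
    return map_coordinates(volume, pts, order=1, mode="nearest")

vol = np.random.rand(64, 64, 64).astype(np.float32)   # stand-in for a CT volume
sl = oblique_slice(vol, center=(32, 32, 32),
                   u=(1, 0, 0), v=(0, 0.7071, 0.7071))  # 45-degree plane
print(sl.shape)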

Relevance:

60.00%

Publisher:

Abstract:

The Alliance for Coastal Technologies (ACT) convened a workshop, sponsored by the Hawaii-Pacific and Alaska Regional Partners, entitled Underwater Passive Acoustic Monitoring for Remote Regions, at the Hawaii Institute of Marine Biology from February 7-9, 2007. The workshop was designed to summarize existing passive acoustic technologies and their uses, as well as to make strategic recommendations for future development and for collaborative programs that use passive acoustic tools for scientific investigation and resource management. The workshop was attended by 29 people representing three sectors: research scientists, resource managers, and technology developers.

The majority of passive acoustic tools are being developed by individual scientists for specific applications, and few tools are available commercially. Most scientists are developing hydrophone-based systems to listen for species-specific information on fish or cetaceans; a few are listening for biological indicators of ecosystem health. Resource managers are interested in passive acoustics primarily for vessel detection in remote protected areas and secondarily for obtaining biological and ecological information. The military has been monitoring with hydrophones for decades; however, the data and signal processing software have not been readily available to the scientific community, and future collaboration is greatly needed.

The challenges that impede the future development of passive acoustics are surmountable with greater collaboration. Hardware exists and is accessible; the limits are in the software and in the interpretation of sounds and their correlation with ecological events. Collaboration with the military and the private companies it contracts will assist scientists and managers in obtaining and developing software and data analysis tools. Collaborative proposals among scientists to receive larger pools of money for exploratory acoustic science will further develop the ability to correlate noise with ecological activities. The existing technologies and data analysis are adequate to meet resource managers' needs for vessel detection. However, collaboration is needed among resource managers to prepare large-scale programs that include centralized processing, to address the lack of local capacity within management agencies to analyze and interpret the data. Workshop participants suggested that ACT might facilitate such collaborations through its website and by providing recommendations to key agencies and programs, such as DOD, NOAA, and IOOS.

There is a need to standardize data formats and to archive acoustic environmental data at the national and international levels. Specifically, there is a need for local training and primers for public education, as well as for pilot demonstration projects, perhaps in conjunction with National Marine Sanctuaries. Passive acoustic technologies should be implemented immediately to address vessel monitoring needs. Ecological and health monitoring applications should be developed as vessel monitoring programs provide additional data and opportunities for more exploratory research. Passive acoustic monitoring should also be correlated with water quality monitoring to ease integration into long-term monitoring programs, such as the ocean observing systems. [PDF contains 52 pages]
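
As an illustration of the kind of simple vessel detection managers could deploy, the following is a minimal sketch that flags audio frames whose energy in a low-frequency band (where engine and propeller noise concentrate) exceeds an adaptive threshold; the band, threshold, and framing are illustrative assumptions, not a technique endorsed by the workshop.

import numpy as np

def band_energy(frame, fs, lo=50.0, hi=500.0):
    # Windowed power spectrum, summed over the band of interest.
    spec = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spec[mask].sum()

def detect_vessel(signal, fs, frame_len=4096, threshold=None):
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len, frame_len)]
    energies = np.array([band_energy(f, fs) for f in frames])
    if threshold is None:                 # adaptive: flag frames > 5x median
        threshold = 5.0 * np.median(energies)
    return energies > threshold

fs = 8000
noise = np.random.randn(fs * 5)           # five seconds of synthetic ambient noise
print(detect_vessel(noise, fs).sum(), "frames flagged")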

Relevance:

60.00%

Publisher:

Abstract:

Thomas, L., Ratcliffe, M., and Robertson, A. 2003. Code warriors and code-a-phobes: a study in attitude and pair programming. SIGCSE Bull. 35, 1 (Jan. 2003), 363-367.

Relevance:

60.00%

Publisher:

Abstract:

In this article the authors describe the development of the RExMobile application and the importance of remote experimentation via mobile devices, especially simple smartphones, as well as the place this application has in education. The article covers the creation of the software and hardware, which provide an interactive and dynamic way to attract more students to use these remote experiments and serve as support for teachers in science teaching from the earliest school years. The affordability and availability of smartphones, even to students in basic education, put these experiments within reach of new users and in different places. Thus, the practice of remote experimentation on mobile devices opens new spaces for access and interaction. Free or low-cost tools are used for developing the software: HTML5 and the jQuery Mobile framework, which enable the creation of pages compatible with different mobile operating systems such as iOS, Android, Windows Phone and some Symbian versions, among others. Layout patterns that allow greater accessibility are also demonstrated.

Relevance:

60.00%

Publisher:

Abstract:

In industrial software development, specification documents play an important role in the communication between analysts and developers. However, with time, staff turnover and ever-shorter deadlines, these documents often become obsolete or inconsistent with the actual state of the system, i.e., its source code. Yet the components of a software system must be kept up to date and consistent with their specification documents to ease their development and maintenance and thus to reduce their cost. Maintaining consistency between specifications and source code requires being able to represent changes to both, and to apply these changes consistently and automatically. We propose a solution for describing a representation of a software system, along with a mathematical formalism for describing and manipulating the evolution of the components of these representations. The formalism is based on Hoare triples to represent the transformations, and on group theory and group homomorphisms to manipulate these transformations and allow their application to the various representations of the system. We illustrate our formalism on two representations of a software system: PADL, a high-level architectural representation (similar to UML), and JCT, a Java-based abstract syntax tree. We also define transformations representing the evolution of these representations, and the transposition that carries transformations from one representation over to the other. Finally, we developed, and briefly describe, an implementation of our illustration: a plugin for the Eclipse IDE that detects the transformations developers apply to the code, and a code generator for integrating new representations into the implementation.
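
As a notational sketch of the formalism just described (the symbols are assumptions; the thesis defines its own notation): a transformation t is specified by a Hoare triple, the transformations over each representation form a group under composition, and transposition is a group homomorphism between these groups:

\[ \{P\}\; t \;\{Q\} \]
\[ \varphi \colon (T_{\mathrm{PADL}}, \circ) \longrightarrow (T_{\mathrm{JCT}}, \circ), \qquad \varphi(t_1 \circ t_2) = \varphi(t_1) \circ \varphi(t_2) \]

The homomorphism property is what lets a sequence of architectural edits be transposed onto the syntax tree (and vice versa) without replaying them one representation at a time.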

Relevance:

60.00%

Publisher:

Abstract:

The process of developing software that takes advantage of multiple processors is commonly referred to as parallel programming. For various reasons, this process is much harder than the sequential case. For decades, parallel programming was a problem for a small niche only: engineers parallelizing mostly numerical applications in High Performance Computing. This has changed with the advent of multi-core processors in mainstream computer architectures. Parallel programming today is a problem for a much larger group of developers. The main objective of this thesis was to find ways to make parallel programming easier for them. Different aims were identified in order to reach this objective: research the state of the art of parallel programming today, improve the education of software developers about the topic, and provide programmers with powerful abstractions to make their work easier.

To reach these aims, several key steps were taken. To start with, a survey was conducted among parallel programmers to find out about the state of the art. More than 250 people participated, yielding results about the parallel programming systems and languages in use, as well as about common problems with these systems. Furthermore, a study was conducted in university classes on parallel programming. It resulted in a list of frequently made mistakes that were analyzed and used to create a programmers' checklist to help avoid them in the future.

For programmers' education, an online resource was set up to collect experiences and knowledge in the field of parallel programming, called the Parawiki. Another key step in this direction was the creation of the Thinking Parallel weblog, where more than 50,000 readers to date have read essays on the topic.

For the third aim (powerful abstractions), it was decided to concentrate on one parallel programming system: OpenMP. Its ease of use and high level of abstraction were the most important reasons for this decision. Two different research directions were pursued. The first resulted in a parallel library called AthenaMP. It contains so-called generic components, derived from design patterns for parallel programming. These include functionality to enhance the locks provided by OpenMP, to perform operations on large amounts of data (data-parallel programming), and to enable the implementation of irregular algorithms using task pools. AthenaMP itself serves a triple role: the components are well documented and can be used directly in programs, developers can study the source code and learn from it, and compiler writers can use it as a testing ground for their OpenMP compilers.

The second research direction was targeted at changing the OpenMP specification to make the system more powerful. The main contributions here were a proposal to enable thread cancellation and a proposal to avoid busy waiting. Both were implemented in a research compiler, shown to be useful in example applications, and proposed to the OpenMP Language Committee.
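
AthenaMP itself is a C++/OpenMP library; as a language-neutral sketch of the task-pool pattern it provides for irregular algorithms (workers pull tasks and may push new ones), the following shows the same idea with Python threads rather than OpenMP. The API is illustrative, not AthenaMP's.

import queue
import threading

def run_task_pool(initial_tasks, worker_fn, n_workers=4):
    tasks = queue.Queue()
    for t in initial_tasks:
        tasks.put(t)

    def worker():
        while True:
            t = tasks.get()
            if t is None:                 # sentinel: shut down this worker
                break
            for new_task in worker_fn(t): # a task may spawn further tasks
                tasks.put(new_task)
            tasks.task_done()

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    tasks.join()                          # wait until every task is processed
    for _ in threads:
        tasks.put(None)
    for th in threads:
        th.join()

# Example irregular workload: traverse a tree given as child lists.
tree = {0: [1, 2], 1: [3], 2: [], 3: []}
visited, lock = [], threading.Lock()

def visit(node):
    with lock:
        visited.append(node)
    return tree[node]                     # children become new tasks

run_task_pool([0], visit, n_workers=2)
print(sorted(visited))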

Relevance:

60.00%

Publisher:

Abstract:

Developing software is still a risky business. After 60 years of experience, this community is still not able to consistently build Information Systems (IS) for organizations with predictable quality, within previously agreed budget and time constraints. Although software is changeable, we are still unable to cope with the amount and complexity of change that organizations demand for their IS. To improve results, developers have followed two alternatives: frameworks, which increase productivity but constrain the flexibility of possible solutions; and agile ways of developing software, which keep flexibility with fewer upfront commitments. With strict frameworks, specific hacks have to be put in place to get around the framework's construction options. In time this leads to inconsistent architectures that are harder to maintain, due to incomplete documentation and human resources turnover. The main goal of this work is to create a new way to develop flexible IS for organizations, using web technologies, in a faster, better and cheaper way that is better suited to handling organizational change. To do so we propose an adaptive object model that uses a new ontology for data and action, with strict normalization rules. These rules should bound the effects of changes, which can then be better tested and therefore corrected. Interfaces are built with templates of resources that can be reused and extended in a flexible way. The "state of the world" for each IS is determined by all the production and coordination acts that agents performed over time, including those performed by external systems. When bugs are found during maintenance, their past cascading effects can be checked through simulation, by re-running the log of transaction acts over time and checking the results against previous records. This work implements a prototype with part of the proposed system in order to make a preliminary assessment of its feasibility and limitations.
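
As an illustration of the "state of the world as a log of acts" idea, the following is a minimal sketch in which state is never stored directly but derived by replaying production and coordination acts, so past effects can be re-checked by re-running the log; the act kinds and the reducer are illustrative assumptions, not the thesis's ontology.

from dataclasses import dataclass

@dataclass(frozen=True)
class Act:
    agent: str
    kind: str        # e.g. "request" (coordination) or "produce" (production)
    payload: dict

def replay(log):
    """Rebuild the current state by folding the full log of acts."""
    state = {"open_requests": [], "products": []}
    for act in log:
        if act.kind == "request":
            state["open_requests"].append(act.payload["what"])
        elif act.kind == "produce":
            if act.payload["what"] in state["open_requests"]:
                state["open_requests"].remove(act.payload["what"])
            state["products"].append(act.payload["what"])
    return state

log = [Act("alice", "request", {"what": "invoice"}),
       Act("bob", "produce", {"what": "invoice"})]
print(replay(log))   # {'open_requests': [], 'products': ['invoice']}

Because the log is the only source of truth, checking a bug's past cascading effects amounts to replaying the same acts with the corrected reducer and diffing the resulting states against previous records.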