183 results for: Migrazione database DBMS processo software dati
Abstract:
The number of software vendors offering ‘Software-as-a-Service’ has been increasing in recent years. In the Software-as-a-Service model, software is operated by the software vendor and delivered to the customer as a service. Existing business models and industry structures are challenged by the changes to the deployment and pricing model compared to traditional software. However, the full implications for the way companies create, deliver and capture value have not yet been sufficiently analyzed. Current research is scattered across specific aspects; only a few studies provide a more holistic view of the impact from a business model perspective. For vendors, however, it is crucial to be aware of the potentially far-reaching consequences of Software-as-a-Service. Therefore, a literature review and three exploratory case studies of leading software vendors are used to evaluate possible implications of Software-as-a-Service on business models. The results show an impact on all business model building blocks and highlight, in particular, the often less articulated impact on key activities, customer relationships and key partnerships for leading software vendors. They also reveal related challenges, for example with regard to the integration of development and operations processes. The observed implications demonstrate the disruptive character of the concept and identify future research requirements.
Abstract:
Single particle analysis (SPA) coupled with high-resolution electron cryo-microscopy is emerging as a powerful technique for the structure determination of membrane protein complexes and soluble macromolecular assemblies. Current estimates suggest that ∼10^4–10^5 particle projections are required to attain a 3 Å resolution 3D reconstruction (symmetry dependent). Selecting this number of molecular projections differing in size, shape and symmetry is a rate-limiting step for the automation of 3D image reconstruction. Here, we present SwarmPS, a feature-rich, GUI-based software package to manage large-scale, semi-automated particle picking projects. The software provides cross-correlation and edge-detection algorithms. Algorithm-specific parameters are transparently and automatically determined through user interaction with the image, rather than by trial and error. Other features include multiple image handling (∼10^2), local and global particle selection options, interactive image freezing, automatic particle centering, and full manual override to correct false positives and negatives. SwarmPS is user-friendly, flexible, extensible, fast, and capable of exporting boxed-out projection images, or particle coordinates, compatible with downstream image processing suites.
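The cross-correlation picking approach can be illustrated with a short sketch. The code below is not SwarmPS itself; it stands in scikit-image's normalized template matching for the correlation step, and the threshold and minimum-separation values are illustrative assumptions.

```python
# Minimal sketch of cross-correlation particle picking (not SwarmPS code;
# threshold and separation values below are assumed, not from the paper).
import numpy as np
from skimage.feature import match_template, peak_local_max

def pick_particles(micrograph, template, threshold=0.4, min_separation=20):
    """Return (row, col) coordinates of candidate particles.

    micrograph : 2D float array (the cryo-EM image)
    template   : 2D float array (a reference particle projection)
    """
    # Normalized cross-correlation of the template against the micrograph.
    score = match_template(micrograph, template, pad_input=True)
    # Local maxima above the threshold become candidate particle centres;
    # min_distance suppresses overlapping duplicate picks.
    return peak_local_max(score, min_distance=min_separation,
                          threshold_abs=threshold)
```

In a real pipeline the returned coordinates would then be centred, manually curated for false positives and negatives, and exported for downstream processing, as the abstract describes.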
Abstract:
Teachers are under increasing pressure from government and school management to incorporate technology into lessons. They need to consider which technologies can most effectively enhance subject learning, encourage higher-order thinking skills and support the performance of authentic tasks. This chapter reviews the practical and theoretical tools that have been developed to aid teachers in selecting software, surveying software assessment methodologies from the 1980s to the present day. It concludes that teachers need guidance to structure the evaluation of technology and to consider its educational affordances, its usability, its suitability for the students and the classroom environment, and its fit with the teachers’ preferred pedagogies.
Abstract:
Self-segregation and compartmentalisation are observed experimentally to occur spontaneously on live membranes as well as on reconstituted model membranes. It is believed that many of these processes are caused or supported by anomalous diffusive behaviour of biomolecules on membranes, due to the complex and heterogeneous nature of these environments. These phenomena are, on the one hand, of great interest in biology, since they may be an important way for biological systems to selectively localize receptors, regulate signaling or modulate kinetics; on the other hand, they provide inspiration for engineering designs that mimic natural systems. We present an interactive software package we are developing for the purpose of simulating such processes numerically using a fundamental Monte Carlo approach. The program includes the ability to simulate kinetics and mass transport in the presence of either mobile or immobile obstacles and other relevant structures, such as liquid-ordered lipid microdomains. We also present preliminary simulation results, obtained using the program, on the power of immobile obstacles to selectively localize species and modulate chemical kinetics on the membrane.
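The core of such a simulation can be sketched in a few lines. The snippet below is not the authors' package; it is a minimal lattice Monte Carlo walk among immobile obstacles, with the lattice size, obstacle fraction and step counts as assumed values.

```python
# Minimal sketch of lattice Monte Carlo diffusion among immobile obstacles,
# in the spirit of the approach described above (parameters are assumed).
import numpy as np

rng = np.random.default_rng(0)
L = 100                                  # lattice side length (assumed)
obstacles = rng.random((L, L)) < 0.3     # 30% blocked sites (assumed)
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def walk(start, n_steps):
    """Random walk in which moves onto obstacle sites are rejected.
    Coordinates stay unwrapped so displacement is meaningful; obstacle
    lookups wrap around (periodic boundaries)."""
    x, y = start
    for _ in range(n_steps):
        dx, dy = MOVES[rng.integers(4)]
        if not obstacles[(x + dx) % L, (y + dy) % L]:
            x, y = x + dx, y + dy
    return x, y

# Mean squared displacement over many tracers started on free sites; values
# well below the step count indicate obstructed (anomalous) diffusion.
free = np.argwhere(~obstacles)
starts = free[rng.integers(len(free), size=200)]
disp2 = []
for sx, sy in starts:
    x, y = walk((sx, sy), 1000)
    disp2.append((x - sx) ** 2 + (y - sy) ** 2)
print(f"MSD after 1000 steps: {np.mean(disp2):.1f} (free diffusion ~ 1000)")
```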
Abstract:
The decision to represent the USDL abstract syntax as a metamodel, shown as a set of UML diagrams, has two main benefits: the ability to show a well-understood, standard graphical representation of the concepts and their relationships to one another, and the ability to use object-oriented frameworks such as the Eclipse Modeling Framework (EMF) to assist in the automated generation of tool support for USDL service descriptions.
Abstract:
Data processing for information extraction is of growing importance and is vital for Web databases. Due to the sheer size of these databases, retrieving the information a user needs has become a cumbersome process. Information seekers face information overload: too many results are returned for broad queries, while too few or none are returned for very specific ones. This paper proposes a ranking algorithm that gives higher preference to a user’s current search and also utilizes profile information in order to obtain relevant results for the user’s query.
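The general shape of such a ranking can be sketched as a weighted blend of current-query relevance and profile affinity. The weighting scheme and attribute model below are illustrative assumptions, not the paper's actual algorithm.

```python
# Sketch of profile-aware ranking: blend relevance to the current query with
# affinity to the user's profile. All names and weights are illustrative.
from dataclasses import dataclass

@dataclass
class Result:
    attributes: dict          # attribute -> value, e.g. {"make": "Toyota"}

def score(result, query_terms, profile_weights, alpha=0.7):
    """query_terms     : set of attribute values in the current query
    profile_weights : value -> historical preference weight in [0, 1]
    alpha           : emphasis on the current search (assumed value)"""
    values = set(result.attributes.values())
    query_score = len(values & query_terms) / max(len(query_terms), 1)
    profile_score = (sum(profile_weights.get(v, 0.0) for v in values)
                     / max(len(values), 1))
    return alpha * query_score + (1 - alpha) * profile_score

def rank(results, query_terms, profile_weights):
    """Order candidate results, highest blended score first."""
    return sorted(results,
                  key=lambda r: score(r, query_terms, profile_weights),
                  reverse=True)
```

Giving `alpha` the larger share implements the paper's stated preference for the user's current search over stored profile information.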
Abstract:
Software forms an important part of the interface between citizens and their government. An increasing number of government functions are being performed, controlled, or delivered electronically. This software, like all language, is never value-neutral, but must, to some extent, reflect the values of the coder and proprietor. The move that many governments are making towards e-governance, and the increasing reliance being placed upon software in government, necessitate a rethinking of the relationships of power and control that are embodied in software.
Abstract:
This paper examines the integration of computing technologies into music education research in a way informed by constructivism. In particular, it focuses on an approach established by Jeanne Bamberger, which the author also employs, that integrates software design, pedagogical exploration, and the building of music education theory. In this tradition, researchers design software and associated activities to facilitate the interactive manipulation of musical structures and ideas. In short, the approach focuses on designing experiences and tools that support musical thinking and doing. In comparing the work of Jeanne Bamberger with that of the author, this paper highlights and discusses issues of significance and identifies lessons for future research.
Abstract:
Projects funded by the Australian National Data Service (ANDS). The specific projects that were funded included: a) the Greenhouse Gas Emissions Project (N2O) with Prof. Peter Grace from QUT’s Institute of Sustainable Resources; b) the Q150 Project for the management of multimedia data collected at Festival events with Prof. Phil Graham from QUT’s Institute of Creative Industries; and c) bio-diversity environmental sensing with Prof. Paul Roe from the QUT Microsoft eResearch Centre. For the purposes of these projects, the Eclipse Rich Client Platform (Eclipse RCP) was chosen as an appropriate software development framework within which to develop the respective software. This poster will present a brief overview of the requirements of the projects and of the project team’s experiences in using Eclipse RCP, report on the advantages and disadvantages of using Eclipse, and give the team’s perspective on Eclipse as an integrated tool for supporting future data management requirements.
Abstract:
Design for Manufacturing (DFM) is an integral methodology in product development, applied from the concept development phase onward, with the aim of improving manufacturing productivity while maintaining product quality. While Design for Assembly (DFA) focuses on the elimination of parts or their combination with other components (Boothroyd, Dewhurst and Knight, 2002), which in most cases means performing a function or manufacturing operation in a simpler way, DFM follows a more holistic approach. In DFM, the considerable background work required in the conceptual phase is compensated for by a shortening of later development phases. Current DFM projects normally apply an iterative step-by-step approach and eventually transfer the results to the development team. Although DFM has been a well-established methodology for about 30 years, a Fraunhofer IAO study from 2009 found that it was still one of the key challenges for the German manufacturing industry. A new, knowledge-based approach to DFM, eliminating steps of the conventional process, was introduced in Paul and Al-Dirini (2009). The concept focuses on a concurrent engineering process between the manufacturing engineering and product development systems, whereas current product realization cycles depend on a rigorous back-and-forth examine-and-correct approach to ensure the compatibility of any proposed design with the DFM rules and guidelines adopted by the company. The key to achieving reductions is to incorporate DFM considerations into the early stages of the design process. A case study of DFM application in an automotive powertrain engineering environment is presented. It is argued that a DFM database needs to be interfaced with the CAD/CAM software, constraining designers to the DFM criteria; a notable reduction of development cycles can thereby be achieved. The case study follows the hypothesis that current DFM methods do not improve product design in the manner claimed. The critical case was to identify DFA/DFM recommendations or program actions that appear repeatedly in different sources. Repetitive DFM measures are identified and analyzed, and it is shown how a modified DFM process can mitigate a not fully integrated DFM approach.
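The proposed DFM-database-to-CAD interface can be sketched as a rule-check layer consulted at design time. The rule names and limits below are illustrative assumptions, not values from the case study.

```python
# Sketch of a DFM rules database consulted by a CAD/CAM environment so that
# non-compliant designs are flagged at design time (all rules are assumed).
from dataclasses import dataclass
from typing import Callable

@dataclass
class DFMRule:
    name: str
    check: Callable[[dict], bool]   # design features -> True if compliant
    message: str

RULES = [
    DFMRule("min_wall_thickness",
            lambda f: f.get("wall_thickness_mm", 0) >= 3.0,
            "Cast walls must be at least 3.0 mm thick."),
    DFMRule("max_drill_depth_ratio",
            lambda f: f.get("hole_depth_mm", 0)
                      <= 5 * f.get("hole_diameter_mm", 1),
            "Drilled holes should not exceed 5x their diameter in depth."),
]

def validate_design(features):
    """Return the messages of all rules violated by a proposed design."""
    return [r.message for r in RULES if not r.check(features)]

for msg in validate_design({"wall_thickness_mm": 2.5,
                            "hole_depth_mm": 40, "hole_diameter_mm": 6}):
    print("DFM violation:", msg)
```

Embedding such checks in the design environment is what moves DFM feedback from the back-and-forth examine-and-correct loop into the early design stages.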
Abstract:
Post-deployment maintenance and evolution can account for up to 75% of the cost of developing a software system. Software refactoring can reduce the costs associated with evolution by improving system quality. Although refactoring can yield benefits, the process includes potentially complex, error-prone, tedious and time-consuming tasks. It is these tasks that automated refactoring tools seek to address. However, although the refactoring process is well-defined, current refactoring tools do not support the full process. To develop better automated refactoring support, we have completed a usability study of software refactoring tools. In the study, we analysed the task of software refactoring using the ISO 9241-11 usability standard and Fitts' List of task allocation. Expanding on this analysis, we reviewed 11 collections of usability guidelines and combined these into a single list of 38 guidelines. From this list, we developed 81 usability requirements for refactoring tools. Using these requirements, the software refactoring tools Eclipse 3.2, Condenser 1.05, RefactorIT 2.5.1, and Eclipse 3.2 with the Simian UI 2.2.12 plugin were studied. Based on the analysis, we have selected a subset of the requirements that can be incorporated into a prototype refactoring tool intended to address the full refactoring process.
Abstract:
With the wide diffusion of Business Process Management (BPM) automation suites, the possibility of managing process-related risks arises. This paper introduces an innovative framework for process-related risk management and describes a working implementation realized by extending the YAWL system. The framework covers three aspects of risk management: risk monitoring, risk prevention, and risk mitigation. Risk monitoring functionality is provided through a sensor-based architecture, where sensors are defined at design time and used at run time for monitoring purposes. Risk prevention functionality is provided in the form of suggestions about what should be executed, by whom, and how, through the use of decision trees. Finally, risk mitigation functionality is provided as a sequence of remedial actions (e.g. reallocating, skipping, or rolling back a work item) that should be executed to restore the process to a normal situation.
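The sensor-based monitoring idea can be illustrated with a short sketch: sensors are declared at design time as condition/severity pairs and polled at run time against process-instance state. This is an illustration of the described architecture, not the YAWL extension's actual API; the sensor names and thresholds are assumptions.

```python
# Sketch of design-time risk sensors polled at run time (names and
# thresholds are illustrative, not from the YAWL implementation).
from dataclasses import dataclass
from typing import Callable

@dataclass
class RiskSensor:
    name: str
    condition: Callable[[dict], bool]   # instance state -> risk raised?
    severity: str

SENSORS = [
    RiskSensor("overtime",
               lambda s: s["elapsed_minutes"] > s["deadline_minutes"],
               "high"),
    RiskSensor("reassignment_churn",
               lambda s: s["reallocations"] > 3,
               "medium"),
]

def monitor(instance_state):
    """Run-time polling step: return all sensors currently firing."""
    return [(s.name, s.severity)
            for s in SENSORS if s.condition(instance_state)]

alerts = monitor({"elapsed_minutes": 95, "deadline_minutes": 60,
                  "reallocations": 1})
print(alerts)   # -> [('overtime', 'high')]
```

Each firing sensor would then feed the prevention (decision-tree suggestions) or mitigation (remedial-action sequence) components described above.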
Abstract:
The Toolbox, combined with MATLAB® and a modern workstation computer, is a useful and convenient environment for the investigation of machine vision algorithms. For modest image sizes the processing rate can be sufficiently "real-time" to allow for closed-loop control. Focus-of-attention methods such as dynamic windowing (not provided) can be used to increase the processing rate. With input from a FireWire or web camera (support provided) and output to a robot (not provided), it would be possible to implement a visual servo system entirely in MATLAB. The Toolbox provides many functions that are useful in machine vision and vision-based control, and is useful for photometry, photogrammetry and colorimetry. It includes over 100 functions spanning operations such as image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration and color space conversion.
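The closed-loop visual servo idea mentioned above can be sketched in a few lines. The snippet below is a language-neutral illustration written in Python with a simulated feature measurement and robot response; it is not the Toolbox's MATLAB API, and the target, gain and convergence threshold are assumed values.

```python
# Sketch of an image-based visual servo loop: a proportional control law
# drives an image feature toward a target position. Camera and robot are
# simulated here; nothing below is the Toolbox's API.
import numpy as np

target = np.array([320.0, 240.0])    # desired feature position (assumed)
feature = np.array([100.0, 400.0])   # current measured feature (simulated)
gain = 0.2                           # proportional gain (assumed)

for step in range(30):
    error = target - feature         # image-plane error, in pixels
    velocity = gain * error          # proportional control law
    feature = feature + velocity     # simulated robot/camera response
    if np.linalg.norm(error) < 1.0:
        print(f"converged after {step} steps")
        break
```

In a real system, the simulated lines would be replaced by an image acquisition and feature extraction step (which the Toolbox provides) and a command to the robot (which it does not).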