981 results for Requirements elicitation techniques
Abstract:
Large enterprises have for many years employed eBusiness solutions to improve their efficiency. Smaller companies, however, have not been able to leverage these technologies because of the high level of know-how and resources required to implement them. To address this, novel software services are being developed to ease eBusiness adoption for small enterprises, with the aim of making B2B integration (B2Bi) feasible not only between large organisations but also between trading partners of all sizes. The objective of this study was to determine which standards and techniques for eBusiness, software testing, and quality assurance are best suited to building these new kinds of software, given the requirements their unique eBusiness approach poses. The research was conducted as a literature study focusing on standards for software testing and quality assurance together with standards for eBusiness. The study showed that current software testing and quality assurance standards lack characteristics that would make any particular standard evidently better suited to building this type of software, which was established to be best developed as web services in order to meet its requirements. A selection of eBusiness standards and technologies was proposed to support this approach. The main finding of the study was, however, that web services with high interoperability requirements must be able to carry out automated interoperability and conformance testing as part of their operation; this objective dictates how the software is built and how testing is to be done during development. The study also showed that research on automated interoperability and conformance testing for web services is still limited, and more research is needed to make building highly interoperable web services more feasible.
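To make the abstract's central finding concrete, here is a minimal sketch of a runtime conformance check a web service client might perform before trusting a partner's response. The endpoint URL, parameters, and response schema are hypothetical illustrations, not anything prescribed by the study.

```python
import requests
from jsonschema import validate, ValidationError

# Hypothetical contract for a trading partner's order-status service.
ORDER_STATUS_SCHEMA = {
    "type": "object",
    "required": ["orderId", "status"],
    "properties": {
        "orderId": {"type": "string"},
        "status": {"type": "string", "enum": ["pending", "shipped", "delivered"]},
    },
}

def conforms(endpoint: str, order_id: str) -> bool:
    """Runtime conformance check: call the service and validate the
    response against the agreed schema before trusting its content."""
    response = requests.get(endpoint, params={"orderId": order_id}, timeout=10)
    if response.status_code != 200:
        return False
    try:
        validate(instance=response.json(), schema=ORDER_STATUS_SCHEMA)
        return True
    except (ValidationError, ValueError):
        return False

# A B2Bi client would gate integration on such checks, e.g.:
# if conforms("https://partner.example/orders", "A-1001"): ...
```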
Abstract:
The design methods and languages targeted at modern System-on-Chip designs face tremendous pressure from ever-increasing complexity, power, and speed requirements. For estimating any of these three metrics, there is a trade-off between accuracy and the level of abstraction at which the system under design is analyzed: the more detailed the description, the more accurate the simulation will be, but also the more time consuming. Moreover, a designer wants to make decisions as early as possible in the design flow to avoid costly design backtracking. To answer these challenges, this thesis introduces a formal, power-aware framework, its development methods, and methods to constrain and analyze the power consumption of the system under design. The thesis discusses power analysis of synchronous and asynchronous systems, including the communication aspects of these systems. The presented framework is built upon the Timed Action System formalism, which offers an environment to analyze and constrain the functional and temporal behavior of a system at a high level of abstraction. Furthermore, given the complexity of System-on-Chip designs, the ability to abstract away unnecessary implementation details at higher abstraction levels is an essential part of the introduced design framework. The encapsulation and abstraction techniques, combined with procedure-based communication, allow a designer to use the presented power-aware framework to model these large-scale systems. The introduced techniques also enable the development of communication and computation to be subdivided into separate tasks, a property that is taken into account in the power analysis as well. Finally, the framework is designed to be usable throughout a design project: a designer is able to model and analyze systems from an abstract specification down to an implementable specification.
Abstract:
Cognitive radio networks sense spectrum occupancy and manage themselves to operate in unused bands without disturbing licensed users. The detection capability of a radio system can be enhanced if the sensing process is performed jointly by a group of nodes, so that the effects of wireless fading and shadowing can be minimized. However, taking a collaborative approach poses new security threats to the system, as nodes can report false sensing data to force a wrong decision. This paper reviews secure cooperative spectrum sensing in cognitive radio networks. The main objective of these protocols is to provide an accurate resolution about the availability of spectrum channels while ensuring that contributions from incapable and malicious users alike are discarded. The issues, advantages, and disadvantages of such protocols are investigated and summarized.
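The core idea, fusing sensing reports while discarding implausible ones, can be illustrated with a generic trimmed-fusion sketch. This is not any specific protocol from the surveyed literature; the median-deviation rule and the example values are illustrative assumptions.

```python
import statistics

def robust_fusion(energy_reports, threshold, max_dev=2.0):
    """Fuse energy-detection reports at the fusion center, discarding
    outlier reports that likely come from malicious or incapable nodes.
    A generic sketch, not a specific protocol from the review."""
    med = statistics.median(energy_reports)
    # Median absolute deviation as a robust spread estimate.
    mad = statistics.median(abs(r - med) for r in energy_reports) or 1e-9
    trusted = [r for r in energy_reports if abs(r - med) / mad <= max_dev]
    # Declare the channel occupied if the trusted consensus exceeds the threshold.
    return statistics.fmean(trusted) > threshold

# Example: one node reports an implausibly low energy to free up the band;
# its report is discarded and the honest consensus prevails.
print(robust_fusion([5.1, 4.8, 5.3, 0.2, 5.0], threshold=3.0))  # True
```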
Abstract:
The ongoing development of digital media has brought a new set of challenges with it. As images containing more than three wavelength bands, often called spectral images, become a more integral part of everyday life, problems in the quality of RGB reproductions from spectral images have turned into an important area of research. The notion of image quality is often thought to comprise two distinct areas: image quality itself, the degree of excellence of the image, and image fidelity, the measure of the match between the image under study and the original. In this thesis, both image fidelity and image quality are considered, with an emphasis on the influence of color and spectral image features on both. Very few works are dedicated to the quality and fidelity of spectral images. Several novel image fidelity measures were developed in this study, including kernel similarity measures and 3D-SSIM (a structural similarity index). The kernel measures incorporate the polynomial, Gaussian radial basis function (RBF), and sigmoid kernels. 3D-SSIM is an extension of the traditional gray-scale SSIM measure, developed to incorporate spectral data. The novel image quality model presented in this study is based on the assumption that the statistical parameters of the spectra of an image influence its overall appearance. The spectral image quality model comprises three quality attributes: colorfulness, vividness, and naturalness. Quality prediction is done by modeling the preference function expressed in JNDs (just noticeable differences). Both the image fidelity measures and the image quality model proved effective in the respective experiments.
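As a flavor of what a kernel-based fidelity measure for spectral images looks like, here is a minimal sketch using the Gaussian RBF kernel on corresponding pixel spectra. The thesis's exact formulation, normalization, and parameters are not reproduced here; the image shapes and gamma value are assumptions.

```python
import numpy as np

def rbf_spectral_similarity(img_a, img_b, gamma=0.5):
    """Mean Gaussian RBF kernel similarity between corresponding pixel
    spectra of two spectral images of shape (H, W, bands). A generic
    sketch of a kernel-based fidelity measure, not the thesis's exact
    formulation."""
    diff = img_a.astype(float) - img_b.astype(float)
    sq_dist = np.sum(diff ** 2, axis=-1)             # per-pixel spectral distance
    return float(np.mean(np.exp(-gamma * sq_dist)))  # 1.0 = identical images

# Example with a random 8-band image compared against itself and a
# slightly distorted copy:
rng = np.random.default_rng(0)
a = rng.random((4, 4, 8))
print(rbf_spectral_similarity(a, a))        # 1.0
print(rbf_spectral_similarity(a, a + 0.1))  # < 1.0
```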
Abstract:
The water content of seafoods is very important, since it affects their sensorial quality, microbiological stability, physical characteristics, and shelf life. In this study, thermoanalytical techniques were employed to develop a simple and accurate method to determine water content (moisture) by thermogravimetry (TG), and to determine water activity from moisture content values and freezing point depression using differential scanning calorimetry (DSC). The precision of the results suggests that TG is a suitable technique for determining moisture content in biological samples. The average water content values for fish samples of the Lutjanus synagris and Ocyurus chrysurus species were 76.4 ± 5.7% and 63.3 ± 3.9%, respectively, while that of the marine alga Ulva lactuca was 76.0 ± 4.4%. The method presented here was also successfully applied to determine water activity in two species of fish and six species of marine algae collected in the Atlantic coastal waters of Bahia, Brazil. Water activity determined in fish samples ranged from 0.946 to 0.960, consistent with values reported in the literature (0.9 to 1.0). The water activity values determined in marine algae samples lay within the interval 0.974 to 0.979.
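The link between freezing point depression and water activity comes from standard equilibrium thermodynamics. The sketch below uses the textbook Clausius-Clapeyron route, assuming ideal behavior and a constant enthalpy of fusion; the study's exact calculation procedure may differ.

```python
import math

H_FUS = 6009.5   # J/mol, enthalpy of fusion of water
R = 8.314        # J/(mol K), gas constant
T0 = 273.15      # K, freezing point of pure water

def water_activity(freezing_point_c):
    """Estimate water activity from the DSC-measured freezing point (in
    degrees C) via ln(a_w) = (dH_fus/R) * (1/T0 - 1/Tf). A standard
    textbook relation, assuming ideality and constant dH_fus."""
    tf = freezing_point_c + 273.15
    return math.exp((H_FUS / R) * (1.0 / T0 - 1.0 / tf))

# A freezing point depressed by about 1 C corresponds to a_w near 0.99,
# consistent with the high a_w range reported for the fish samples:
print(round(water_activity(-1.0), 3))  # 0.990
```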
Abstract:
The building industry is a high-volume sector that could provide prominent markets for wood-based interior decoration solutions. Competition in interior decoration markets requires versatility in appearance. Versatility in wood appearance, and added value, could be achieved by printing grain patterns of different species or other images directly onto wood. The basic problem in applying wood printing to durable applications is how to transfer a high-quality image or print sustainably onto wood, which is porous, heterogeneous, dimensionally unstable, non-white, and rough. Wood preservation, treatment, and modification can provide durability against degradation, but they also affect the surface properties of wood, which in turn affect printability. Optimal adhesion is essential to print quality: too high an ink absorbance can cause spreading, while too low an ink absorbance causes pale prints. Different printing techniques place different requirements on materials and production. Direct printing on wood means that no intermediate media are used. Printing techniques with flexible printing plates, or indeed non-impact techniques, provide the best basis for wood printing. Inkjet printing on wood with different mechanical or chemical surface treatments, and on wood-plastic composite material, gave good results that encourage further study of the subject. Sanding the wood surface anti-parallel to the grain gave the best overall printing quality. Spreading parallel to the grain could not be avoided entirely, except in cases where the wood had been treated to be hydrophobic, so that ink adhesion was insufficient. The grain pattern of the underlying wood remains clearly visible in the printed images. Further studies should fine-tune the methods that already gave good results; the effects of the moisture content of the wood, of different inks, and of long-term exposure to UV radiation should also be tested.
Abstract:
The front end of innovation is regarded as one of the most important steps in building new software products or services, and the most significant benefits in software development can be achieved through improvements in front end activities. Problems in the front end phase have an impact on customer dissatisfaction with delivered software and on the effectiveness of the entire software development process. When these processes are improved, the likelihood of delivering high-quality software and of achieving business success increases. This thesis highlights the challenges and problems related to the early phases of software development, and provides new methods and tools for improving performance in the front end activities of software development. The theoretical framework of this study comprises two fields of research. The first belongs to the field of innovation management, especially the management of the early phases of the innovation process, i.e. the front end of innovation. The second is closely linked to the processes of software engineering, especially the early phases of the software development process, i.e. the practice of requirements engineering. This study thus extends the theoretical knowledge of, and discloses the differences and similarities between, these two fields of research. In addition, it opens up a new strand of academic discussion by connecting these research directions. Several qualitative business research methodologies were utilized in the individual publications to answer the research questions. The theoretical and managerial contribution of the study can be divided into three areas: 1) processes and concepts, 2) challenges and development needs, and 3) means and methods for the front end activities of software development. First, the study discloses the differences and similarities between the concepts of the front end of innovation and requirements engineering, and proposes a new framework for managing the front end of the software innovation process, bringing business and innovation perspectives into software development. Furthermore, the study discloses managerial perceptions of the similarities and differences in the concept of the front end of innovation between the software industry and the traditional industrial sector. Second, the study highlights the challenges and development needs of the front end phase of software development, especially challenges in communication, such as linguistic problems, ineffective communication channels, the communication gap between users/customers and software developers, and the participation of multiple persons in software development. Third, the study proposes new group methods for improving the front end activities of software development, especially customer need assessment and the elicitation of software requirements.
Abstract:
Two spectrophotometric methods are described for the simultaneous determination of ezetimibe (EZE) and simvastatin (SIM) in pharmaceutical preparations. The data obtained were evaluated using two different chemometric techniques, Principal Component Regression (PCR) and Partial Least-Squares (PLS-1). In these techniques, the concentration data matrix was prepared using mixtures of the drugs in methanol. The corresponding absorbance data matrix was obtained by measuring absorbances in the zero-order spectra over the range 240-300 nm at intervals of Δλ = 1 nm (61 wavelengths); calibration (regression) models were then built from the absorbance and concentration data matrices to predict the unknown concentrations of EZE and SIM in their mixture. The procedure did not require any separation step. The linear range was found to be 5-20 µg mL⁻¹ for both EZE and SIM in both methods. The accuracy and precision of the methods were assessed. Both methods were successfully applied to a pharmaceutical preparation (tablets), and the results were compared with each other.
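The PCR and PLS calibration step described above can be sketched in a few lines with scikit-learn. The spectra below are simulated via a Beer-Lambert mixing model as a stand-in for the measured 61-wavelength absorbance matrix; the component counts and noise level are assumptions, not the study's values.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the 61-wavelength absorbance matrix (rows =
# calibration mixtures) and the two-column concentration matrix
# [EZE, SIM]; the real study used measured spectra of methanol mixtures.
rng = np.random.default_rng(1)
concentrations = rng.uniform(5, 20, size=(15, 2))          # ug/mL
pure_spectra = rng.random((2, 61))                         # hypothetical
absorbance = concentrations @ pure_spectra + rng.normal(0, 0.01, (15, 61))

# PCR: regress concentrations on principal-component scores of the spectra.
pcr = make_pipeline(PCA(n_components=4), LinearRegression())
pcr.fit(absorbance, concentrations)

# PLS: latent variables chosen to covary with the concentrations.
pls = PLSRegression(n_components=4)
pls.fit(absorbance, concentrations)

# Predict an "unknown" mixture; both models recover values near [12, 8].
unknown = np.array([[12.0, 8.0]]) @ pure_spectra
print(pcr.predict(unknown), pls.predict(unknown))
```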
Abstract:
Here we investigate the formation of superficial micro- and nanostructures in poly(ethylene-2,6-naphthalate) (PEN), with a view to their use in biomedical devices, and compare its performance with that of a polymer commonly used for fabricating such devices, poly(methyl methacrylate) (PMMA). PEN is found to replicate both micro- and nanostructures in its surface, albeit requiring more forceful replication conditions than PMMA, with a slight increase in surface hydrophilicity. This ability to form micro/nanostructures, allied to biocompatibility and good optical transparency, suggests that PEN could be a useful material for the production of, or for incorporation into, transparent devices for biomedical applications. Such devices could be autoclaved, owing to the polymer's high-temperature stability, and would be useful for applications requiring forceful experimental conditions, owing to PEN's superior chemical resistance over PMMA.
Abstract:
This article explores the possibilities offered by visual methods in the move towards inclusive research, reviewing some methodological implications of such research and reflecting on the potential of visual methods to meet these methodological requirements. A study of the impact of work on the social inclusion and social relationships of people with severe mental illness (SMI) serves to illustrate the use of visual methods, such as photo elicitation and graphic elicitation, in the context of in-depth interviews, with the aim of improving the target group's participation in research; participation is understood as one of the basic elements of inclusive approaches. On the basis of this study, we reflect on the potential of visual methods to improve the inclusive approach to research and conclude that these methods are open and flexible in giving participants a voice, allowing people with SMI to express their needs and therefore adding value to said approach.
Abstract:
A study of the partial USEPA 3050B and total ISO 14869-1:2001 digestion methods for sediments was performed. USEPA 3050B is recommended as the simpler method with less operational risk; however, the extraction ability of the method should be taken into account for the best environmental interpretation of the results. FAAS was used to quantify metal concentrations in the sediment solutions. The alternative use of ICP-OES quantification should be conditioned on a prior detailed investigation and, where necessary, correction of the matrix effect. For the first time, the EID method was employed for the detection and correction of the matrix effect in sediment ICP-OES analysis. Finally, some considerations are presented on the level of metal contamination in the area under study.
Abstract:
The importance of medicinal plants and their use in industrial applications is increasing worldwide, especially in Brazil. Phyllanthus species, popularly known as "quebra-pedras" (stone-breaker) in Brazil, are used in folk medicine to treat urinary infections and renal calculus. This paper reports an authenticity study of herbal drugs from Phyllanthus species, involving commercial and authentic samples, using spectroscopic techniques (FT-IR, ¹H HR-MAS NMR, and ¹H NMR in solution) combined with chemometric analysis. The spectroscopic techniques evaluated, coupled with chemometric methods, have great potential for investigating complex matrices. Furthermore, several metabolites were identified by the NMR techniques.
Abstract:
We propose an analytical method based on Fourier transform infrared-attenuated total reflectance (FTIR-ATR) spectroscopy to detect the adulteration of petrodiesel and petrodiesel/palm-biodiesel blends with African crude palm oil. The infrared spectral fingerprints from the sample analysis were used to perform principal component analysis (PCA) and to construct a prediction model using partial least squares (PLS) regression. The PCA results separated the samples into three groups, allowing identification of those adulterated with palm oil. The resulting model shows good predictive capacity for determining the concentration of palm oil in petrodiesel/biodiesel blends. Advantages of the proposed method include cost-effectiveness and speed; it is also environmentally friendly.
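A minimal sketch of the PCA grouping step follows. The spectra are simulated stand-ins for the FTIR-ATR fingerprints of the three sample groups; the group offsets, noise level, and dimensions are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

# Simulated stand-ins for FTIR-ATR fingerprints of the three groups:
# neat petrodiesel, a petrodiesel/biodiesel blend, and palm-oil-adulterated
# samples. Measured spectra would replace these rows in practice.
rng = np.random.default_rng(2)
base = rng.random(600)  # shared spectral background
groups = {
    "petrodiesel": base + rng.normal(0, 0.02, (10, 600)),
    "blend":       base + 0.3 + rng.normal(0, 0.02, (10, 600)),
    "adulterated": base - 0.3 + rng.normal(0, 0.02, (10, 600)),
}
X = np.vstack(list(groups.values()))

# Project onto the first two principal components; the per-group score
# means show the three groups separating along PC1.
scores = PCA(n_components=2).fit_transform(X)
for name, block in zip(groups, np.split(scores, 3)):
    print(name, block.mean(axis=0).round(2))
```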