22 results for Requirements engineering process

at Universidade do Minho


Relevance:

100.00%

Publisher:

Abstract:

When representing the requirements for an intended software solution during the development process, a logical architecture is a model that provides an organized vision of how functionalities behave regardless of the technologies to be implemented. If the logical architecture represents an ambient assisted living (AAL) ecosystem, such representation is a complex task due to the existence of interrelated multidomains, which, most of the time, results in incomplete and incoherent user requirements. In this chapter, we present the results obtained when applying process-level modeling techniques to the derivation of the logical architecture for a real industrial AAL project. We adopt a V-Model-based approach that expresses the AAL requirements in a process-level perspective, instead of the traditional product-level view. Additionally, we ensure compliance of the derived logical architecture with the National Institute of Standards and Technology (NIST) reference architecture, treated as a set of nonfunctional requirements, to support the implementation of the AAL architecture in cloud contexts.

Relevance:

90.00%

Publisher:

Abstract:

Integrated master's dissertation in Telecommunications and Informatics Engineering

Relevance:

80.00%

Publisher:

Abstract:

Developing and implementing data-oriented workflows for data migration processes are complex tasks involving several problems related to the integration of data coming from different schemas. These workflows usually have very specific requirements - every process is almost unique. Having a way to abstract their representation helps us to better understand and validate them with business users, which is a crucial step for requirements validation. In this demo we present an approach for incrementally enriching conceptual models so that their corresponding physical implementation can be produced automatically. We show how the B2K (Business to Kettle) system transforms BPMN 2.0 conceptual models into executable Kettle data-integration processes, covering the most relevant aspects of model design and enrichment, model-to-system transformation, and system execution.
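
As a minimal sketch of this kind of model-to-system transformation (not the actual B2K implementation), the following Python fragment maps tasks of a simplified, already-enriched BPMN model to step elements of a Kettle-style transformation file; the task attributes, step types and XML layout are illustrative assumptions.

```python
# Illustrative BPMN-to-Kettle style mapping: each "task" of a simplified,
# already-enriched conceptual model becomes a step of a Kettle-like (.ktr)
# transformation. Step types and XML layout are simplified assumptions.
import xml.etree.ElementTree as ET

# Hypothetical enriched BPMN tasks, annotated with data-integration details.
bpmn_tasks = [
    {"id": "t1", "name": "Read customers", "kind": "input",  "table": "src_customers"},
    {"id": "t2", "name": "Load customers", "kind": "output", "table": "dw_customers"},
]

STEP_TYPES = {"input": "TableInput", "output": "TableOutput"}  # illustrative mapping

def to_kettle(tasks):
    root = ET.Element("transformation")
    for task in tasks:
        step = ET.SubElement(root, "step")
        ET.SubElement(step, "name").text = task["name"]
        ET.SubElement(step, "type").text = STEP_TYPES[task["kind"]]
        ET.SubElement(step, "table").text = task["table"]
    return ET.tostring(root, encoding="unicode")

print(to_kettle(bpmn_tasks))
```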

Relevance:

80.00%

Publisher:

Abstract:

Software product lines (SPLs) are diverse systems that are developed using a dual engineering process: (a) family engineering defines the commonality and variability among all members of the SPL, and (b) application engineering derives specific products based on the common foundation combined with a variable selection of features. The number of derivable products in an SPL can thus be exponential in the number of features. This inherent complexity poses two main challenges when it comes to modelling. Firstly, the formalism used for modelling SPLs needs to be modular and scalable. Secondly, it should ensure that all products behave correctly by providing the ability to analyse and verify complex models efficiently. In this paper we propose to integrate an established modelling formalism (Petri nets) with the domain of software product line engineering. To this end we extend Petri nets to Feature Nets. While Petri nets provide a framework for formally modelling and verifying single software systems, Feature Nets offer the same sort of benefits for software product lines. We show how SPLs can be modelled in an incremental, modular fashion using Feature Nets, provide a Feature Nets variant that supports modelling dynamic SPLs, and propose an analysis method for SPLs modelled as Feature Nets. By facilitating the construction of a single model that includes the various behaviours exhibited by the products in an SPL, we make a significant step towards efficient and practical quality assurance methods for software product lines.
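
The core idea can be sketched as an ordinary Petri net whose transitions carry an application condition over the selected features; the toy below is an illustration of that idea, not the formalism defined in the paper.

```python
# Toy sketch of a feature-annotated Petri net: a transition fires only if its
# input places are marked AND its application condition holds for the selected
# feature set. Illustrative only; not the paper's Feature Net definition.

class FeatureNet:
    def __init__(self, marking, transitions):
        # marking: dict place -> token count
        # transitions: dict name -> (input places, output places, condition),
        #   where condition is a predicate over the set of selected features
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, name, features):
        inputs, _, condition = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs) and condition(features)

    def fire(self, name, features):
        if not self.enabled(name, features):
            raise ValueError(f"transition {name} not enabled")
        inputs, outputs, _ = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical coffee-machine SPL: 'serve_milk' exists only in products
# that select the feature "Milk".
net = FeatureNet(
    marking={"idle": 1},
    transitions={
        "serve_coffee": (["idle"], ["done"], lambda fs: True),
        "serve_milk":   (["idle"], ["done"], lambda fs: "Milk" in fs),
    },
)
print(net.enabled("serve_milk", {"Coffee"}))          # False: feature not selected
print(net.enabled("serve_milk", {"Coffee", "Milk"}))  # True
```

Restricting such a net to one feature selection yields the behaviour of the corresponding product, which is what allows a single model to cover the whole product line.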

Relevance:

80.00%

Publisher:

Abstract:

Genome-scale metabolic models are valuable tools in the metabolic engineering process, based on the ability of these models to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence for their construction, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly, and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of a biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of a core metabolic model; (7) generation of a biomass composition reaction; (8) completion of the draft metabolic model; (9) curation of the metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
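
As a worked toy example of the kind of global prediction a finished model supports, the following flux balance analysis maximises a biomass flux subject to steady-state mass balances; the three-reaction network, its bounds, and the use of scipy are illustrative assumptions, not material from the chapter.

```python
# Toy flux balance analysis (FBA): maximise the biomass flux v3 subject to
# steady-state mass balances S.v = 0 and flux bounds. The 3-reaction network
# is hypothetical; real genome-scale models have thousands of reactions.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions v1..v3)
#   v1: uptake -> A,   v2: A -> B,   v3: B -> biomass (exported)
S = np.array([
    [ 1, -1,  0],   # metabolite A
    [ 0,  1, -1],   # metabolite B
])
bounds = [(0, 10), (0, 1000), (0, 1000)]   # flux bounds; uptake capped at 10
c = [0, 0, -1]                             # linprog minimises, so maximise v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)   # 10.0, limited by the uptake bound
```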

Relevance:

40.00%

Publisher:

Abstract:

Doctoral thesis in Information Systems and Technologies

Relevance:

30.00%

Publisher:

Abstract:

Tendon tissue engineering (TE) requires tailoring scaffold designs and properties to the anatomical and functional requirements of tendons located in different regions of the body. Cell sourcing is also of utmost importance, as tendon cells are scarce. Recently, we have found that it is possible to direct the tenogenic differentiation of amniotic fluid- and adipose tissue-derived stem cells (hAFSCs and hASCs), and also that there are hASC subpopulations that may be more prone to tenogenic differentiation. Nevertheless, biochemical stimulation may not be enough to develop functional TE substitutes for a tissue that is known to be highly dependent on mechanical loading.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a framework of competences developed for Industrial Engineering and Management (IEM) that can be used as a tool for curriculum analysis and design, including the teaching and learning processes as well as the alignment of the curriculum with the professional profile. The framework was applied to the Industrial Engineering and Management program at University of Minho (UMinho), Portugal, and it provides an overview of the connection between IEM knowledge areas and the competences defined in its curriculum. The framework of competences was developed through a process of analysis using a combination of methods and sources for data collection, according to four main steps: 1) characterization of IEM knowledge areas; 2) definition of IEM competences; 3) a survey; 4) application of the framework to the IEM curriculum. The findings showed that the framework is useful for building an integrated vision of the curriculum. The most visible aspect in the learning outcomes of the IEM program is the lack of balance between technical and transversal competences: there is almost no reference to transversal competences, and what reference exists is concentrated in Project-Based Learning courses. The framework presented in this paper contributes to the definition of the IEM professional profile through a set of competences which need to be explored further. In addition, it may be a relevant tool for IEM curriculum analysis and a contribution to bridging the gap between universities and companies.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a proposal for a management model based on reliability requirements concerning Cloud Computing (CC). The proposal was based on a literature review focused on the problems, challenges and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment. This literature review examined the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for Cloud Computing environment". This is a proactive proposal that aims to organize and discuss management solutions for the CC environment, seeking improved reliability in the operation of IS applications for both providers and their customers. Central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established involving the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location and other references related to the use of the service. In this process, cloud service providers and consumers themselves must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. Organizing these studies and disseminating them in the market as a conceptual model able to establish parameters that regulate a reliable relationship between providers and users of IT services in the CC environment is a useful instrument for guiding providers, developers and users towards secure and reliable services and applications.
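
A minimal sketch of the kind of metric-and-control check the model expects both parties to maintain is given below: collected service measurements are compared against thresholds agreed in an SLA. The metric names and limits are hypothetical and not taken from the paper.

```python
# Illustrative SLA compliance check (not part of the proposed trust model):
# compare collected measurements against agreed thresholds. Metric names and
# limits are hypothetical.
from dataclasses import dataclass

@dataclass
class SlaTerm:
    metric: str
    limit: float
    higher_is_better: bool

sla = [
    SlaTerm("availability_pct", 99.5, higher_is_better=True),
    SlaTerm("incident_response_hours", 4.0, higher_is_better=False),
]

measurements = {"availability_pct": 99.2, "incident_response_hours": 3.0}

def compliance_report(terms, measured):
    report = {}
    for term in terms:
        value = measured[term.metric]
        ok = value >= term.limit if term.higher_is_better else value <= term.limit
        report[term.metric] = ("compliant" if ok else "violated", value, term.limit)
    return report

for metric, (status, value, limit) in compliance_report(sla, measurements).items():
    print(f"{metric}: {status} (measured {value}, agreed {limit})")
```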

Relevance:

30.00%

Publisher:

Abstract:

The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation with the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
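
One of the listed exchanges (convection with the environment) can be illustrated with a lumped-capacitance cooling sketch for a single deposited filament; the property values are assumed, and the full analysis in the paper couples all of the listed heat-transfer mechanisms.

```python
# Lumped-capacitance sketch of a single deposited filament cooling by
# convection with the environment only; conduction to the support and to
# adjacent filaments, radiation and entrapped-air convection are ignored here.
# Property values are assumptions, not measured data.
import numpy as np

h     = 30.0      # convection coefficient, W/(m^2 K)  (assumed)
d     = 0.3e-3    # filament diameter, m
rho   = 1050.0    # density, kg/m^3                    (ABS-like, assumed)
cp    = 2000.0    # specific heat, J/(kg K)            (assumed)
T_env = 70.0      # envelope temperature, deg C
T0    = 230.0     # extrusion temperature, deg C

area_per_volume = 4.0 / d                   # lateral area / volume of a cylinder
tau = rho * cp / (h * area_per_volume)      # cooling time constant, s

t = np.linspace(0.0, 5.0, 6)
T = T_env + (T0 - T_env) * np.exp(-t / tau)
for ti, Ti in zip(t, T):
    print(f"t = {ti:3.1f} s  ->  T = {Ti:6.1f} deg C")
```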

Relevance:

30.00%

Publisher:

Abstract:

Children are an especially vulnerable population, particularly with respect to drug administration. It is estimated that neonatal and pediatric patients are at least three times more vulnerable than adults to harm arising from adverse events and medication errors. The development of this framework is intended to provide a Clinical Decision Support System based on a prototype already tested in a real environment. The framework will include features such as the preparation of Total Parenteral Nutrition prescriptions, tables of pediatric and neonatal emergency drugs, morbidity and mortality scales, anthropometric percentiles (weight, length/height, head circumference and BMI), utilities supporting medical decisions on the treatment of neonatal jaundice and anemia, support for technical procedures, and other commonly used calculators and tools. The solution under development is an extension of the INTCare project. The main goal is to make this functionality available at all times during clinical practice, as well as outside the hospital environment for dissemination, education and the simulation of hypothetical situations. A further aim is to develop an area for the study and analysis of information and the extraction of knowledge from the data collected through the use of the system. This paper presents the architecture, its requirements and functionalities, and a SWOT analysis of the proposed solution.
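
One of the listed utilities, anthropometric percentiles, can be sketched with the LMS method used by common growth references; the L, M, S values below are placeholders rather than actual WHO/CDC reference data, and the code is not the framework's implementation.

```python
# Sketch of an anthropometric percentile utility using the LMS method.
# The L, M, S values are placeholders, NOT actual WHO/CDC reference data.
import math

def lms_zscore(value, L, M, S):
    """Z-score of a measurement given the LMS parameters for an age/sex group."""
    if L != 0:
        return ((value / M) ** L - 1.0) / (L * S)
    return math.log(value / M) / S

def percentile(z):
    """Percentile corresponding to a z-score (standard normal CDF)."""
    return 50.0 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical LMS parameters for weight-for-age of a given age/sex group.
L, M, S = -0.35, 9.5, 0.11
weight_kg = 10.4
z = lms_zscore(weight_kg, L, M, S)
print(f"z-score = {z:.2f}, percentile = {percentile(z):.1f}")
```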

Relevance:

30.00%

Publisher:

Abstract:

Noise affects people in many different ways and in almost every aspect of daily life. The most prominent impact of noise exposure is hearing loss. However, it can also impair people in their work settings through effects other than hearing loss. Older workers tend to be more susceptible to the effects of noise exposure at work, firstly because most of them already have some 'natural' hearing loss as a result of the ageing process, and secondly because they also tend to be more susceptible at a psychological level. The current study attempts to describe the potential problem, to survey the available active noise cancellation (ANC) systems, and to specify the main requirements for this type of system to be applied in such contexts. Several characteristics of ANC systems were identified and are presented in this study. From the results obtained it was possible to get a clearer idea of the potential of this technology, and to confirm that this type of solution can be extremely important as a component of an active ageing program, as the preservation of hearing will also have an impact on the social life of the exposed workers.
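
Although the study is a survey, the mechanism at the heart of most ANC systems is an adaptive filter; the sketch below is a minimal single-channel LMS cancellation loop on synthetic signals, assuming an ideal secondary path (plain LMS rather than FxLMS), and does not correspond to any surveyed product.

```python
# Minimal single-channel LMS adaptive-cancellation sketch on synthetic data.
# Assumes an ideal secondary path (plain LMS rather than FxLMS); illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
reference = rng.standard_normal(n)                      # noise at the reference mic
primary = np.convolve(reference, [0.6, 0.3, 0.1])[:n]   # noise reaching the error mic

taps, mu = 8, 0.01
w = np.zeros(taps)
error = np.zeros(n)
for i in range(taps, n):
    x = reference[i - taps + 1:i + 1][::-1]   # most recent reference samples
    anti_noise = w @ x
    error[i] = primary[i] - anti_noise        # residual heard at the error mic
    w += mu * error[i] * x                    # LMS weight update

print("residual power (first 10%):", np.mean(error[taps:n // 10] ** 2).round(4))
print("residual power (last 10%): ", np.mean(error[-n // 10:] ** 2).round(4))
```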

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis in Sciences - Specialization in Biology

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis - Civil Engineering

Relevance:

30.00%

Publisher:

Abstract:

The monitoring data collected during tunnel excavation can be used in inverse analysis procedures in order to identify more realistic geomechanical parameters that increase the knowledge about the formations of interest. These more realistic parameters can then be used in real time to adapt the project to the actual in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not specifically for the purpose of inverse analysis. In fact, there is a lack of knowledge about the types and quantity of measurements needed to succeed in identifying the parameters of interest. The optimisation algorithm chosen for the identification procedure may also be important in this respect. In this work, this problem is addressed using a theoretical case on which a thorough parametric study was carried out with two optimisation algorithms based on different calculation paradigms, namely a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters to be identified and for several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and of the chosen algorithm for the success rate of the inverse analysis process.
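
A toy version of the identification loop is sketched below: a simple (1+1)-evolution strategy recovers a stiffness-like parameter by minimising the misfit between "monitored" and computed displacements. The forward model, noise level and parameter values are illustrative stand-ins for the tunnel model studied in the paper.

```python
# Toy inverse analysis with a (1+1)-evolution strategy: keep the parent unless
# a mutated candidate gives a lower misfit against the synthetic monitoring data.
# Forward model and values are illustrative, not the paper's tunnel model.
import numpy as np

rng = np.random.default_rng(1)
positions = np.linspace(1.0, 5.0, 8)             # monitoring point coordinates (m)

def forward(E):
    """Hypothetical forward model: displacements inversely proportional to stiffness E."""
    return 12.0 / (E * positions)

E_true = 150.0                                   # "real" in situ parameter (MPa)
measured = forward(E_true) + rng.normal(0, 1e-4, positions.size)  # monitoring data

def misfit(E):
    return np.sum((forward(E) - measured) ** 2)

E, step = 50.0, 20.0                             # initial guess and mutation step
for _ in range(200):
    candidate = abs(E + step * rng.standard_normal())
    if misfit(candidate) < misfit(E):
        E = candidate
    step *= 0.99                                 # gradually reduce the step size

print(f"identified E = {E:.1f} (true value {E_true})")
```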