945 results for Functional Requirements for Authority Data (FRAD)


Relevância:

100.00%

Publicador:

Resumo:

The software engineering community has paid little attention to non-functional requirements, or quality attributes, compared with the work done on the capture, analysis and validation of functional requirements. This gap is even more pronounced in distributed applications, where, besides quality attributes such as correctness, robustness, extendibility, reusability, compatibility, efficiency, portability and ease of use, we must also consider reliability, scalability, transparency, security, interoperability and concurrency, among others. In this work we show how these latter attributes relate to the different abstractions that coexist in the problem domain. To this end, we have established a taxonomy of quality attributes for distributed applications and determined the set of services needed to support them.

Relevância:

100.00%

Publicador:

Resumo:

This work describes the full process carried out to redesign an automatic tutoring system that integrates with virtual laboratories developed to support students' practical sessions in 3D virtual environments. The main goals of this redesign are to improve the performance of the automatic tutoring system, making it more efficient and therefore allowing more students to perform a practical session at the same time, and to enable the tutor to be integrated with other graphics engines at a relatively low cost. We begin with an introduction to the main concepts used in this work and to previous work related to this redesign, namely the earlier version of the tutoring system bound to the OpenSim platform. We then detail the functional requirements the new design meets and the advantages it provides. Next, we explain how the work was developed: how the previous tutoring system was restructured, how an object-oriented design was applied, and the packages and classes derived from this design. Finally, we outline the conclusions drawn during the development of this work and its implications for future work.

Relevância:

100.00%

Publicador:

Resumo:

Federated clouds can expose the Internet as a homogeneous compute fabric. This creates an opportunity for cross-cloud applications that can be deployed pervasively over the Internet, dynamically adapting their internal topology to their needs. In this paper we explore the main challenges to fully realizing the potential of cross-cloud applications. First, we focus on the networking dimension of these applications: we evaluate what support is needed from the infrastructure and what the further implications of opening up the networking side are. In the second part, we examine the impact of a distributed deployment of applications, assessing the implications from a management perspective and how it affects the delivery of quality of service and non-functional requirements.

Relevância:

100.00%

Publicador:

Resumo:

Business information has become a critical asset for companies and it has even more value when obtained and exploited in real time. This paper analyses how to integrate this information into an existing banking Enterprise Architecture, following an event-driven approach, and entails the study of three main issues: the definition of business events, the specification of a reference architecture, which identifies the specific integration points, and the description of a governance approach to manage the new elements. All the proposed solutions have been validated with a proof-of-concept test bed in an open source environment. It is based on a case study of the banking sector that allows an operational validation to be carried out, as well as ensuring compliance with non-functional requirements. We have focused these requirements on performance.
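The event-driven integration the abstract describes can be pictured with a minimal publish/subscribe sketch. This is only an illustration of the general pattern; the event names, payload fields, and handler wiring below are our own assumptions, not the paper's actual reference architecture.

```python
# Minimal event bus sketch (illustrative; names and event types are assumptions).
class EventBus:
    def __init__(self):
        self.handlers = {}

    def subscribe(self, event_type, handler):
        # Register a handler for a business event type.
        self.handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        # Deliver the event to every subscribed handler.
        for handler in self.handlers.get(event_type, []):
            handler(payload)

bus = EventBus()
alerts = []
# A hypothetical real-time monitoring component subscribes to a business event.
bus.subscribe("large_withdrawal", lambda event: alerts.append(event))
# An existing banking system emits the event at its integration point.
bus.publish("large_withdrawal", {"account": "A-1", "amount": 50000})
```

In an actual Enterprise Architecture the bus would be a message broker and the governance layer would manage the event catalogue, but the decoupling idea is the same.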

Relevância:

100.00%

Publicador:

Resumo:

This summary presents a methodology for supporting the development of AOSAs following the MDD paradigm. This new methodology, called PRISMA, enables code generation from models that specify both functional and non-functional requirements.

Relevância:

100.00%

Publicador:

Resumo:

The importance of embedded software is growing, as it is required for a large number of systems, and devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing, and as a result it is now possible to integrate complex systems in a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements; their failure may result in loss of life or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards, and in some cases certification of the software is also necessary. This scenario fosters the use of mixed-criticality systems, in which applications of different criticality levels coexist in a single system. In these cases it is usually necessary to certify the whole system, including non-critical applications, at the level of the most critical application, which is costly. Virtualization emerges as an enabling technology for dealing with this problem: the system is structured as a set of partitions, or virtual machines, that execute with strong temporal and spatial isolation, so applications can be developed and certified independently.
The development of MCPS (Mixed-Criticality Partitioned Systems) requires roles and activities that traditional systems do not. The system integrator has to define the system partitions, and application development has to consider the characteristics of the partition to which each application is allocated. Traditional software process models therefore have to be adapted to this scenario. The V-model, commonly used in embedded systems development, has been adapted to MCPS development by enabling the parallel development of applications and the incorporation of a new partition into an existing system.
The objective of this PhD thesis is to improve the available technology for MCPS development by providing a framework tailored to this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The framework integrates all the activities required for developing MCPS, including the new roles and activities mentioned above. It is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process, and provides the means for modeling and partitioning the system, validating the results, and generating the artifacts needed to compile, build and deploy it. The framework has been designed to facilitate its extension and the integration of external validation tools; in particular, it can be extended with support for additional non-functional requirements and for new final artifacts, such as other programming languages or additional documentation.
A key part of the framework is the partitioning algorithm. It has been designed to be independent of the types of application requirements and to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved through partitioning constraints that the resulting partitioning is guaranteed to meet. The constraints have sufficient expressive capacity for a small set of them to state the most common non-functional requirements, and they can be defined manually by the system integrator or generated automatically by a tool from the functional and non-functional requirements of an application. The partitioning algorithm takes the system models and the partitioning constraints as inputs and generates a deployment model composed of a set of partitions; each partition in turn defines the applications allocated to it and the resources it needs to execute correctly. The partitioning problem, including applications and constraints, is modeled mathematically as a colored graph, in which a proper vertex coloring represents a valid partitioning, and the algorithm can provide alternative partitionings to the one initially proposed if required. The framework, including the partitioning algorithm, has been successfully used in two industrial use cases: the UPMSat-2 satellite and a demonstrator of the control system of a wind turbine. The partitioning algorithm has also been validated by running a large number of synthetic scenarios, including complex ones with more than 500 applications.
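The reduction of partitioning to proper vertex coloring can be sketched briefly: applications are vertices, "must run in different partitions" constraints are edges, and a proper coloring assigns each application a partition. The greedy heuristic and the example applications below are our own illustration, not the thesis's actual algorithm or data.

```python
# Sketch: partitioning as proper vertex coloring (heuristic and names are ours).
def partition(apps, separation_constraints):
    """Greedy proper coloring: returns {application: partition_id}."""
    adjacency = {a: set() for a in apps}
    for a, b in separation_constraints:
        adjacency[a].add(b)
        adjacency[b].add(a)
    coloring = {}
    # Color highest-degree (most-constrained) applications first.
    for app in sorted(apps, key=lambda a: -len(adjacency[a])):
        used = {coloring[n] for n in adjacency[app] if n in coloring}
        color = 0
        while color in used:  # smallest partition id not used by a neighbor
            color += 1
        coloring[app] = color
    return coloring

# Hypothetical applications with mixed criticality levels.
apps = ["nav", "telemetry", "payload", "logging"]
constraints = [("nav", "payload"), ("nav", "logging"), ("telemetry", "payload")]
plan = partition(apps, constraints)
```

Every separation constraint is an edge, so a proper coloring guarantees no two conflicting applications share a partition; enumerating alternative colorings would yield the alternative partitionings the thesis mentions.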

Relevância:

100.00%

Publicador:

Resumo:

Leguminous plants in symbiosis with rhizobia form either indeterminate nodules with a persistent meristem or determinate nodules with a transient meristematic region. Sesbania rostrata was thought to possess determinate stem and root nodules. However, the nature of nodule development is hybrid, and the early stages resemble those of indeterminate nodules. Here we show that, depending on the environmental conditions, mature root nodules can be of the indeterminate type. In situ hybridizations with molecular markers for plant cell division, as well as the patterns of bacterial nod and nif gene expression, confirmed the indeterminate nature of 30-day-old functional root nodules. Experimental data provide evidence that the switch in nodule type is mediated by the plant hormone ethylene.

Relevância:

100.00%

Publicador:

Resumo:

The induced expression of c-Myc in plasmacytomas in BALB/c mice is regularly associated with nonrandom chromosomal translocations that juxtapose the c-myc gene to one of the Ig loci on chromosome 12 (IgH), 6 (IgK), or 16 (IgL). The DCPC21 plasmacytoma belongs to a small group of plasmacytomas that are unusual in that they appear to be translocation-negative. In this paper, we show the absence of any c-myc-activating chromosomal translocation for the DCPC21 by using fluorescent in situ hybridization, chromosome painting, and spectral karyotyping. We find that DCPC21 harbors c-myc and IgH genes on extrachromosomal elements (EEs) from which c-myc is transcribed, as shown by c-myc mRNA tracks and extrachromosomal gene transfer experiments. The transcriptional activity of these EEs is supported further by the presence of the transcription-associated phosphorylation of histone H3 (H3P) on the EEs. Thus, our data suggest that in this plasmacytoma, c-Myc expression is achieved by an alternative mechanism. The expression of the c-Myc oncoprotein is initiated outside the chromosomal locations of the c-myc gene, i.e., from EEs, which can be considered functional genetic units. Our data also imply that other “translocation-negative” experimental and human tumors with fusion transcripts or oncogenic activation may indeed carry translocation(s), however, in an extrachromosomal form.

Relevância:

100.00%

Publicador:

Resumo:

Recruitment of intracellular proteins to the plasma membrane is a commonly found requirement for the initiation of signal transduction events. The recently discovered pleckstrin homology (PH) domain, a structurally conserved element found in ∼100 signaling proteins, has been implicated in this function, because some PH domains have been described to be involved in plasma membrane association. Furthermore, several PH domains bind to the phosphoinositides phosphatidylinositol-(4,5)-bisphosphate and phosphatidylinositol-(3,4,5)-trisphosphate in vitro, however, mostly with low affinity. It is unclear how such weak interactions can be responsible for observed membrane binding in vivo as well as the resulting biological phenomena. Here, we investigate the structural and functional requirements for membrane association of cytohesin-1, a recently discovered regulatory protein of T cell adhesion. We demonstrate that both the PH domain and the adjacent carboxyl-terminal polybasic sequence of cytohesin-1 (c domain) are necessary for plasma membrane association and biological function, namely interference with Jurkat cell adhesion to intercellular adhesion molecule 1. Biosensor measurements revealed that phosphatidylinositol-(3,4,5)-trisphosphate binds to the PH domain and c domain together with high affinity (100 nM), whereas the isolated PH domain has a substantially lower affinity (2–3 μM). The cooperativity of both elements appears specific, because a chimeric protein, consisting of the c domain of cytohesin-1 and the PH domain of the β-adrenergic receptor kinase does not associate with membranes, nor does it inhibit adhesion. Moreover, replacement of the c domain of cytohesin-1 with a palmitoylation–isoprenylation motif partially restored the biological function, but the specific targeting to the plasma membrane was not retained. Thus we conclude that two elements of cytohesin-1, the PH domain and the c domain, are required and sufficient for membrane association. 
This appears to be a common mechanism for plasma membrane targeting of PH domains, because we observed a similar functional cooperativity of the PH domain of Bruton’s tyrosine kinase with the adjacent Bruton’s tyrosine kinase motif, a novel zinc-containing fold.

Relevância:

100.00%

Publicador:

Resumo:

Interactive systems are increasingly present among the technological resources people use, and for interaction to occur these devices must be adapted to users' real needs. To guarantee the quality of interaction, it is necessary to focus on the principle of system usability developed by Jakob Nielsen (1994), thereby improving accessibility, flexibility and efficiency of use so that the technological resource becomes an agent of change in this relationship. Objective: to develop an interactive panel using Kinect technology and thereby provide information on self-care and the prevention of leprosy-related disabilities to patients and health professionals. Methodology: based on the consensual model, which proposes a solution to the design problem and is divided into four phases: (1) informational design; (2) conceptual design; (3) preliminary design; and (4) detailed design. Results: a prototype was produced containing images, text and videos with information about leprosy. It comprises material collected from the manuals produced by the Ministério da Saúde for guidance on leprosy care, plus a video demonstrating how the resource would be accessed: the user stands in front of the panel at a distance of 80 cm and, using upper-limb movements, selects the desired content with one hand, which becomes a "virtual hand" moved on the screen to choose the instructional material. The functional and non-functional requirements were organized to ensure legible, sharp images and text options aimed at comprehension and access by the population. Sixteen videos were developed that teach how to perform exercises to prevent disabilities and possible deformities, thus encouraging self-care.
Conclusion: educational material about leprosy that makes use of new technologies is scarce and little explored by rehabilitation professionals working with the disease. Investing in actions that make people better informed about their disease and more confident about their treatment can contribute to autonomy in part of their care, and new technologies can be an important ally in this process.

Relevância:

100.00%

Publicador:

Resumo:

Architectural decisions are often encoded in the form of constraints and guidelines, and non-functional requirements can be ensured by checking the conformance of the implementation against this kind of invariant. Conformance checking is often a costly and error-prone process that involves the use of multiple tools differing in effectiveness, complexity and scope of applicability. To reduce the overall effort entailed by this activity, we propose a novel approach that supports the verification of human-readable declarative rules through the use of adapted off-the-shelf tools. Our approach consists of a rule specification DSL, called Dicto, and a tool coordination framework, called Probo. The approach has been implemented in a prototype that will be evaluated shortly.
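The idea of checking an implementation against human-readable declarative rules can be sketched as follows. The rule syntax and the dependency model below are our own simplification; Dicto's actual language and Probo's tool coordination are richer than this.

```python
# Sketch of declarative architectural rules checked against a dependency model
# (rule format and module names are illustrative assumptions, not Dicto syntax).
RULES = [
    ("ui", "cannot-depend-on", "database"),
    ("domain", "cannot-depend-on", "ui"),
]

def check(dependencies, rules):
    """Return (source, target) pairs that violate a cannot-depend-on rule."""
    violations = []
    for source, _, target in rules:
        if target in dependencies.get(source, set()):
            violations.append((source, target))
    return violations

# A conforming layered architecture: ui -> domain -> database.
deps_ok = {"ui": {"domain"}, "domain": {"database"}, "database": set()}
clean = check(deps_ok, RULES)

# A violation: the ui layer reaches the database directly.
deps_bad = {"ui": {"database", "domain"}, "domain": {"database"}}
broken = check(deps_bad, RULES)
```

In the paper's setting, extracting the dependency model and evaluating each rule would be delegated to the adapted off-the-shelf tools rather than hand-written checks.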

Relevância:

100.00%

Publicador:

Resumo:

This paper discusses how the AustLit: Australian Literature Gateway's interpretation, enhancement, and implementation of the International Federation of Library Associations and Institutions' Functional Requirements for Bibliographic Records (FRBR Final Report 1998) model is meeting the needs of Australian literature scholars for accurate bibliographic representation of the histories of literary texts. It also explores how the AustLit Gateway's underpinning research principles, which are based on the tradition of scholarly enumerative and descriptive bibliography, with enhancements from analytical bibliography and literary biography, have impacted upon our implementation of the FRBR model. The major enhancement or alteration to the model is the use of enhanced manifestations, which allow the full representation of all agents' contributions to be shown in a highly granular format by enabling creation events to be incorporated at all levels of the Work, Expression, and Manifestation nexus.

Relevância:

100.00%

Publicador:

Resumo:

Verbal working memory and emotional self-regulation are impaired in Bipolar Disorder (BD). Our aim was to investigate the effect of Lamotrigine (LTG), which is effective in the clinical management of BD, on the neural circuits subserving working memory and emotional processing. Functional Magnetic Resonance Imaging data from 12 stable BD patients was used to detect LTG-induced changes as the differences in brain activity between drug-free and post-LTG monotherapy conditions during a verbal working memory (N-back sequential letter task) and an angry facial affect recognition task. For both tasks, LGT monotherapy compared to baseline was associated with increased activation mostly within the prefrontal cortex and cingulate gyrus, in regions normally engaged in verbal working memory and emotional processing. Therefore, LTG monotherapy in BD patients may enhance cortical function within neural circuits involved in memory and emotional self-regulation. © 2007 Elsevier B.V. and ECNP.

Relevância:

100.00%

Publicador:

Resumo:

Having arisen in large corporations, six sigma is surely one of the most comprehensive approaches for company development and performance improvement of products and processes. Nevertheless, it appears that the majority of small and medium-sized enterprises (SMEs) either do not know the six sigma approach, or find its organisation not suitable to meet their specific requirements. This study identifies the specific requirements based on a sample of SMEs in Germany and examines how six sigma has to be modified to be applicable and valuable in an SME environment. The overall results are reflected in ten imperative functional requirements for an adjusted approach. © Emerald Group Publishing Limited.

Relevância:

100.00%

Publicador:

Resumo:

Purpose – The data used in this study cover the period 1980-2000. Almost midway through this period (in 1992), the Kenyan government liberalized the sugar industry: the role of the market increased, while the government's role in controlling prices, imports and other aspects of the sector declined. This exposed local sugar manufacturers to external competition from other sugar producers, especially from the COMESA region. This study aims to determine whether there were any changes in production efficiency between the two periods (pre- and post-liberalization). Design/methodology/approach – The study utilized two methodologies for efficiency estimation: data envelopment analysis (DEA) and the stochastic frontier. DEA uses mathematical programming techniques and does not impose any functional form on the data; however, it attributes all deviation from the mean function to inefficiency. The stochastic frontier utilizes econometric techniques. Findings – The test for structural differences between the two periods does not show any statistically significant differences. However, both methodologies show a decline in efficiency levels from 1992, with the lowest level experienced in 1998; from then on, efficiency levels began to increase. Originality/value – To the best of the authors' knowledge, this is the first paper to use both methodologies in the sugar industry in Kenya. It is shown that in industries where the noise (error) term is minimal (such as manufacturing), DEA and the stochastic frontier give similar results.