843 results for Framework development


Relevance: 30.00%

Abstract:

The project is a development API for the Spanish electronic national identity card (DNIe) that makes it easy to build applications whose functionality relies on the DNIe. The framework exposes the main operations supported by the DNIe through simple method calls. One of the features is an authentication process with the DNIe that uses the cryptographic capabilities of the embedded chip and the authentication certificate; this functionality can also be accessed in two separate steps, in order to support applications with a client-server architecture. The framework also offers electronic signing with the DNIe, producing a legally valid signature that additionally allows the integrity of the signed message to be checked. Finally, the framework supports the verification of a certificate through the OCSP protocol; although this functionality does not involve the DNIe directly, it is important in the context of the processes involved in any Public Key Infrastructure.
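The abstract does not publish the framework's actual class or method names, so the Java interface below is only a hypothetical sketch of the API surface it describes: one-step and two-step authentication, legally valid signing with an integrity check, and OCSP certificate validation.

import java.security.cert.X509Certificate;

// Hypothetical sketch: identifiers are illustrative placeholders, not the framework's real API.
interface DnieFramework {

    // Authentication using the chip's cryptographic capabilities and the
    // authentication certificate (single-step variant).
    boolean authenticate(char[] pin);

    // Two-step variant of the same operation, intended for client-server applications:
    // the challenge is built on one side and the response is verified on the other.
    byte[] buildAuthenticationChallenge();
    boolean verifyAuthenticationResponse(byte[] response);

    // Legally valid electronic signature; verification also checks message integrity.
    byte[] sign(byte[] message, char[] pin);
    boolean verifySignature(byte[] message, byte[] signature);

    // Certificate validation through the OCSP protocol (general PKI support).
    boolean checkCertificateWithOcsp(X509Certificate certificate);
}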

Relevance: 30.00%

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power or throughput are heavily constrained. In order to produce implementations where cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which designers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, since it compensates for the slower clock frequencies and the less efficient hardware utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization-noise modelling and word-length optimization techniques. In this Ph.D. Thesis we explore different aspects of the quantization problem and present new methodologies for each of them.

Techniques based on extensions of intervals have produced very accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art methodology based on statistical Modified Affine Arithmetic (MAA), in order to model systems that contain control-flow structures. Our methodology generates the different execution paths automatically, determines the regions of the input domain that exercise each of them, and extracts the statistical moments of the system from these partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% from the simulation-based reference values. A known drawback of techniques based on extensions of intervals is the combinatorial explosion of terms as the size of the targeted system grows, which leads to scalability problems. To address this issue we present a clustered noise-injection technique that groups the signals of the system, introduces the noise sources for each group separately, and finally combines the partial results. In this way the number of noise sources active at any given time is kept under control and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results caused by the loss of correlation between noise terms, in order to keep the results as accurate as possible.

This Thesis also covers the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times. We present two novel techniques that reduce the execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization stage. Second, the incremental method builds on the fact that, although a given confidence interval must be guaranteed for the final results of the search, more relaxed confidence levels (and therefore considerably fewer samples per simulation) can be used in the initial stages of the search, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be reduced by factors of up to ×240 for small and medium-sized problems.

Finally, this work introduces HOPLITE, an automated, flexible and modular quantization framework that implements the previous techniques and is publicly available. Its aim is to offer developers and researchers a common ground for easily prototyping and verifying new methodologies for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API and give a step-by-step demonstration of its execution. We also show, through a simple example, how new extensions should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
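HOPLITE's real API is not reproduced in this abstract, so the Java sketch below is purely illustrative. Under simplified assumptions (a toy datapath, rounding-only quantization, RMS error as the cost metric), it shows the two ingredients discussed above: Monte-Carlo estimation of the quantization error and a classical greedy word-length descent, together with the incremental idea of spending fewer simulation samples while the search is still far from the final solution.

import java.util.Arrays;
import java.util.Random;

public class WordLengthSketch {
    static final Random RNG = new Random(42);

    // Round a value to a fixed-point grid with 'fracBits' fractional bits.
    static double quantize(double v, int fracBits) {
        double scale = Math.pow(2, fracBits);
        return Math.round(v * scale) / scale;
    }

    // Toy datapath y = a*x + b with an independent word-length per signal.
    static double fixedPointEval(double x, int[] wl) {
        double a    = quantize(0.7071, wl[0]);
        double xq   = quantize(x, wl[1]);
        double prod = quantize(a * xq, wl[2]);
        return quantize(prod + 0.25, wl[3]);
    }

    // Monte-Carlo estimate of the RMS error against the double-precision reference.
    static double rmsError(int[] wl, int samples) {
        double acc = 0;
        for (int i = 0; i < samples; i++) {
            double x = RNG.nextDouble() * 2 - 1;               // inputs in [-1, 1)
            double err = fixedPointEval(x, wl) - (0.7071 * x + 0.25);
            acc += err * err;
        }
        return Math.sqrt(acc / samples);
    }

    public static void main(String[] args) {
        double budget = 1e-3;                 // maximum admissible RMS error
        int[] wl = {16, 16, 16, 16};          // generous starting word-lengths
        int samples = 1_000;                  // relaxed accuracy while far from the optimum

        boolean improved = true;
        while (improved) {
            improved = false;
            for (int s = 0; s < wl.length; s++) {
                wl[s]--;                      // tentatively remove one bit from signal s
                if (wl[s] > 0 && rmsError(wl, samples) <= budget) {
                    improved = true;          // cheaper solution still meets the budget
                } else {
                    wl[s]++;                  // revert the move
                }
            }
            samples = Math.min(samples * 2, 100_000); // tighten confidence as we converge
        }
        System.out.println("Final fractional word-lengths: " + Arrays.toString(wl));
        System.out.printf("Final RMS error (high-confidence check): %.2e%n",
                          rmsError(wl, 100_000));
    }
}

The final check with the full sample count mirrors the requirement that the confidence interval must still be guaranteed for the result actually reported; only the intermediate search steps use the relaxed budgets.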

Relevance: 30.00%

Abstract:

Plant growth and development are regulated by interactions between the environment and endogenous developmental programs. Of the various environmental factors controlling plant development, light plays an especially important role, in photosynthesis, in seasonal and diurnal time sensing, and as a cue for altering developmental pattern. Recently, several laboratories have devised a variety of genetic screens using Arabidopsis thaliana to dissect the signal transduction pathways of the various photoreceptor systems. Genetic analysis demonstrates that light responses are not simply endpoints of linear signal transduction pathways but are the result of the integration of information from a variety of photoreceptors through a complex network of interacting signaling components. These signaling components include the red/far-red light receptors, phytochromes, at least one blue light receptor, and negative regulatory genes (DET, COP, and FUS) that act downstream from the photoreceptors in the nucleus. In addition, a steroid hormone, brassinolide, also plays a role in light-regulated development and gene expression in Arabidopsis. These molecular and genetic data are allowing us to construct models of the mechanisms by which light controls development and gene expression in Arabidopsis. In the future, this knowledge can be used as a framework for understanding how all land plants respond to changes in their environment.

Relevance: 30.00%

Abstract:

B cells with a rearranged heavy-chain variable region VHa allotype-encoding VH1 gene segment predominate throughout the life of normal rabbits and appear to be the source of the majority of serum immunoglobulins, which thus bear VHa allotypes. The functional role(s) of these VH framework region (FR) allotypic structures has not been defined. We show here that B cells expressing surface immunoglobulin with VHa2 allotypic specificities are preferentially expanded and positively selected in the appendix of young rabbits. By flow cytometry, a higher proportion of a2+ B cells were progressing through the cell cycle (S/G2/M) compared to a2- B cells, most of which were in the G1/G0 phase of the cell cycle. The majority of appendix B cells in dark zones of germinal centers of normal 6-week-old rabbits were proliferating, and very little apoptosis was observed. In contrast, in 6-week-old VH-mutant ali/ali rabbits, little cell proliferation and extensive apoptosis were observed. Nonetheless, even in the absence of VH1, B cells with a2-like surface immunoglobulin had developed and expanded in the appendix of 11-week-old mutants. The numbers and tissue localization of B cells undergoing apoptosis then appeared similar to those found in the 6-week-old normal appendix. Thus, B cells with immunoglobulin receptors lacking the VHa2 allotypic structures were less likely to undergo clonal expansion and maturation. These data suggest that "positive" selection of B lymphocytes through FR1 and FR3 VHa allotypic structures occurs during their development in the appendix.

Relevance: 30.00%

Abstract:

This paper explores the gap in the literature between what is herein referred to as the "first psychotherapy case" and its impact on the development of the trainee psychotherapist's professional self. The self psychology concepts of identity development, selfobject needs and fulfillment, narcissism, shame, countertransference, and structuralization are incorporated into the theoretical framework from which this developmental milestone is viewed. The theory's emphasis on early experiences and the development of self highlight the distinctiveness of the first case for the therapist. The beginning psychotherapy case poses a unique context for selfobject experiences and the developing self, involving both the therapist's presumably mature needs (assuming an existing cohesive nuclear self) and more infantile needs as the professional, peripheral self develops. As a result, the potential and important implications for the psychotherapist, the patient, training implications for the supervisor, and the ensuing treatment through termination are identified. The intent is to shed light on an area that is understudied thus far, and to begin a conversation as to why and how the impact of the first case on the psychotherapist should be examined. Implications, limitations, and ideas for future exploratory and qualitative research are also discussed.

Relevance: 30.00%

Abstract:

Knowledge has taken on a preferential role in the explanation of development, while the evidence on the effect of natural resources on countries' performance is more controversial in the economic literature. This paper tries to demonstrate that natural resources may positively affect growth in countries with a strong natural-resource specialization pattern, although the magnitude of these effects depends on the type of resources and on other aspects related to the production and innovation systems. The positive trajectory described by a set of national economies mainly specialized in natural resources and low-tech industries invites us to analyze which combination of factors serves as the engine of a sustainable development process. With panel data for the period 1996-2008 we estimate an applied growth model where both traditional factors and others more closely related to innovation and absorptive capabilities are taken into account. Our empirical findings show that, in line with the postulates of a knowledge-based approach, a framework that combines physical and intangible factors is more suitable for defining development strategies in those prosperous economies dominated by natural resources and connected activities, while the internationalization of activities and technologies also becomes a very relevant aspect.
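The abstract does not report the paper's exact specification, so the following is only a generic, hedged sketch of what an applied panel growth model combining traditional and knowledge-related factors typically looks like:

g_{it} = \alpha + \beta_{1}\,\mathrm{NatRes}_{it} + \beta_{2}\,\mathrm{Inv}_{it} + \beta_{3}\,\mathrm{HumCap}_{it} + \beta_{4}\,\mathrm{R\&D}_{it} + \beta_{5}\,\mathrm{Open}_{it} + \mu_{i} + \lambda_{t} + \varepsilon_{it}

where g_{it} is the growth rate of country i in year t, NatRes captures natural-resource specialization, Inv and HumCap are the traditional investment and human-capital factors, R&D and Open proxy innovation and absorptive/internationalization capabilities, and \mu_{i} and \lambda_{t} are country and time effects. The variables and functional form actually used in the paper may differ.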

Relevance: 30.00%

Abstract:

When conceptualizing healthy couple relationships, it is tempting to use a simple framework as a panacea. Unfortunately, this desire for simplicity can lead to a narrow and naive perspective. Individuals interact and are influenced by a variety of factors (i.e., various social systems, multiple context memberships, complex interconnecting exchanges, etc.); consequently, it is necessary to guard against an overly narrow interpretation when examining healthy couple interactions. It is the purpose of this paper to develop one aspect of a complex perspective for healthy couple relationships by comparing couple life cycle development with couple intimacy-distance regulation.

Relevance: 30.00%

Abstract:

The present thesis is focused on the development of a thorough mathematical modelling and computational solution framework aimed at the numerical simulation of journal and sliding bearing systems operating under a wide range of lubrication regimes (mixed, elastohydrodynamic and full-film lubrication) and working conditions (static, quasi-static and transient). The fluid flow effects have been considered in terms of the Isothermal Generalized Equation of the Mechanics of Viscous Thin Films (Reynolds equation), along with the mass-conserving p-θ Elrod-Adams cavitation model, which ensures the so-called JFO complementary boundary conditions for fluid film rupture. The variation of the lubricant rheological properties due to the viscosity-pressure (Barus and Roelands equations), viscosity-shear-thinning (Eyring and Carreau-Yasuda equations) and density-pressure (Dowson-Higginson equation) relationships has also been taken into account in the overall modelling. Generic models have been derived for the aforementioned bearing components in order to enable their application in general multibody dynamic systems (MDS), including the effects of angular misalignments, superficial geometric defects (form/waviness deviations, EHL deformations, etc.) and axial motion. The bearing flexibility (conformal EHL) has been incorporated by means of FEM model reduction (or condensation) techniques. The macroscopic influence of the mixed-lubrication phenomena has been included in the modelling through the stochastic Patir and Cheng average flow model and the Greenwood-Williamson/Greenwood-Tripp formulations for rough contacts. Furthermore, a deterministic mixed-lubrication model with inter-asperity cavitation has also been proposed for full-scale simulations at the microscopic (roughness) level.

On the basis of this extensive mathematical modelling background, three significant contributions have been accomplished. Firstly, a general numerical solution of the Reynolds lubrication equation with the mass-conserving p-θ cavitation model has been developed based on the hybrid-type Element-Based Finite Volume Method (EbFVM). This new solution scheme allows lubrication problems with complex geometries to be discretized by unstructured grids. The numerical method was validated against several example cases from the literature and further used in numerical experiments to explore its flexibility in coping with irregular meshes, with the aim of reducing the number of nodes required in the solution of textured sliding bearings. Secondly, novel robust partitioned techniques, namely the Fixed Point Gauss-Seidel Method (PGMF), the Point Gauss-Seidel Method with Aitken Acceleration (PGMA) and the Interface Quasi-Newton Method with Inverse Jacobian from Least-Squares approximation (IQN-ILS), commonly adopted for solving fluid-structure interaction problems, have been introduced in the context of tribological simulations, particularly for the coupled calculation of dynamic conformal EHL contacts. The performance of these partitioned methods was evaluated through simulations of dynamically loaded connecting-rod big-end bearings of both heavy-duty and high-speed engines. Finally, the proposed deterministic mixed-lubrication modelling was applied to investigate the influence of cylinder liner wear, after a 100 h dynamometer engine test, on the hydrodynamic pressure generation and friction of Twin-Land Oil Control Rings.
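For reference, a standard compressible form of the Reynolds thin-film equation that such a framework solves, written here before adding the Elrod-Adams p-θ cavitation treatment and the rheological relations listed above (the generalized form used in the thesis may differ), is:

\frac{\partial}{\partial x}\!\left(\frac{\rho h^{3}}{12\mu}\,\frac{\partial p}{\partial x}\right) + \frac{\partial}{\partial z}\!\left(\frac{\rho h^{3}}{12\mu}\,\frac{\partial p}{\partial z}\right) = \frac{U}{2}\,\frac{\partial(\rho h)}{\partial x} + \frac{\partial(\rho h)}{\partial t}

where p is the film pressure, h the film thickness, ρ and μ the lubricant density and viscosity, and U the relative sliding velocity of the surfaces.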

Relevance: 30.00%

Abstract:

Abundant research has shown that poverty negatively influences young children's academic and psychosocial development, and unfortunately, disparities in school readiness between low- and high-income children can be seen as early as the first year of life. The largest federal early care and education intervention for these vulnerable children is Early Head Start (EHS). To diminish these disparate child outcomes, EHS seeks to provide community-based, flexible programming for infants and toddlers and their families. Given how recently these programs have been offered, little is known about the nuances of how EHS impacts infant and toddler language and psychosocial development. Using a Community-Based Participatory Research (CBPR) framework, this paper had five goals: 1) to characterize the associations between domain-specific and cumulative risk and child outcomes; 2) to validate and explore these risk-outcome associations separately for children of Hispanic immigrants (COHIs); 3) to explore relationships among family characteristics, multiple environmental factors, and dosage patterns in different EHS program types; 4) to examine the relationship between EHS dosage and child outcomes; and 5) to examine how EHS compliance impacts child internalizing and externalizing behaviors and emerging language abilities. Results of the current study showed that risks were differentially related to child outcomes. Poor maternal mental health was related to child internalizing and externalizing behaviors, but not to emerging child language skills. Although child language skills were not related to maternal mental health, they were related to economic hardship. Additionally, parent-level Spanish use and heritage orientation were associated with positive child outcomes. Results also showed that these relationships differed when COHIs and children with native-born parents were examined separately. Further, unique patterns emerged for EHS program use; for example, families who participated in home-based care were less likely to comply with EHS attendance requirements. These findings provide tangible suggestions for EHS stakeholders: namely, the need to develop effective programming that targets engagement for the diverse families enrolled in EHS programs.

Relevance: 30.00%

Abstract:

This testimony discusses proposed legislation to amend the definition of accredited investor. It also discusses proposed legislation designed to reform the regulatory framework for business development companies. Among other things, the regulatory regime for BDCs would change to allow these companies to invest a greater portion of their assets in financial companies, potentially reducing the percentage of assets invested in operating companies.

Relevance: 30.00%

Abstract:

Sustainable development (or sustainability) is a decision-making framework for maintaining and achieving human well-being, both in the present and into the future. The framework requires both consideration and achievement of environmental protection, social justice and economic development. In that framework, environmental protection must be integrated into decisions about social and economic development, and social justice and economic viability must be integrated into decisions about environmental quality. First endorsed by the world’s nations in 1992, this framework is intended to provide an effective response to the twin global challenges of growing environmental degradation and widespread extreme poverty. Sustainability provides a framework for humans to live in harmony with nature, rather than at nature’s expense. It may therefore be one of the most important ideas to come out of the 20th century. In the last two decades, the framework has become a touchstone in nearly every economic sector and at every level of government, unleashing an extraordinary range of creativity in all of those realms. Sustainable development is having a significant effect on the practice of law and on the way in which laws are written and implemented. Understanding the framework is increasingly important for law makers and lawyers. As sustainable development (or sustainability) has grown in prominence, its critics have become more numerous and more vocal. Three major lines of criticism are that the term is “too boring” to command public attention, “too vague” to provide guidance, and “too late” to address the world’s problems. Critics suggest goals such as abundance, environmental integrity, and resilience. Beginning with the international agreements that shaped the concept of sustainable development, this Article provides a functional and historical analysis of the meaning of sustainable development. It then analyzes and responds to each of these criticisms in turn. While the critics, understood constructively, suggest ways of strengthening this framework, they do not provide a compelling alternative. The challenge for lawyers, law makers, and others is to use and improve this framework to make better decisions.

Relevance: 30.00%

Abstract:

As world communication, technology, and trade become increasingly integrated through globalization, multinational corporations seek employees with global leadership experience and skills. However, the demand for these skills currently outweighs the supply. Given the rarity of globally ready leaders, global competency development should be emphasized in higher education programs. The reality, however, is that university graduate programs are often outdated and focus mostly on cognitive learning. Global leadership competence requires moving beyond the cognitive domain of learning to create socially responsible and culturally connected global leaders. This requires attention to development methods; however, limited research in global leadership development methods has been conducted. A new conceptual model, the global leadership development ecosystem, was introduced in this study to guide the design and evaluation of global leadership development programs. It was based on three theories of learning and was divided into four development methodologies. This study quantitatively tested the model and used it as a framework for an in-depth examination of the design of one International MBA program. The program was first benchmarked, by means of a qualitative best practices analysis, against the top-ranking IMBA programs in the world. Qualitative data from students, faculty, administrators, and staff was then examined, using descriptive and focused data coding. Quantitative data analysis, using PASW Statistics software, and a hierarchical regression, showed the individual effect of each of the four development methods, as well as their combined effect, on student scores on a global leadership assessment. The analysis revealed that each methodology played a distinct and important role in developing different competencies of global leadership. It also confirmed the critical link between self-efficacy and global leadership development.

Relevance: 30.00%

Abstract:

Data mining is one of the most important analysis techniques for automatically extracting knowledge from large amounts of data. Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to treat it as a true software-engineering process. Bearing this situation in mind, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks related to the technical details of the underlying platform. These time-consuming tasks are now entrusted to the model-transformation scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario where a time-series analysis is required.
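The paper's metamodels and transformation language are not reproduced in this abstract, so the Java sketch below is only an illustration of the model-driven principle it describes: a platform-independent conceptual mining model that a transformation turns into a platform-specific analysis configuration, keeping the analyst away from low-level settings. All names are hypothetical.

import java.util.List;
import java.util.Map;

public class MiningScenarioSketch {

    // Platform-independent conceptual model: what the analyst wants to analyse.
    record ConceptualMiningModel(String dataset, String technique,
                                 List<String> dimensions, String measure) { }

    // Platform-specific analysis model: how a concrete engine must be configured.
    record PlatformAnalysisModel(String engine, Map<String, String> settings) { }

    // A model transformation maps the conceptual model onto one target platform.
    interface ModelTransformation {
        PlatformAnalysisModel transform(ConceptualMiningModel source);
    }

    // Hypothetical transformation targeting a generic time-series engine.
    static class TimeSeriesTransformation implements ModelTransformation {
        @Override
        public PlatformAnalysisModel transform(ConceptualMiningModel m) {
            return new PlatformAnalysisModel(
                "generic-ts-engine",
                Map.of("dataset", m.dataset(),
                       "target",  m.measure(),
                       "orderBy", String.join(",", m.dimensions())));
        }
    }

    public static void main(String[] args) {
        var conceptual = new ConceptualMiningModel(
            "monthly_sales", "time-series", List.of("month"), "revenue");
        var platformSpecific = new TimeSeriesTransformation().transform(conceptual);
        // The analyst works only with the conceptual model; the low-level settings
        // are generated automatically by the transformation.
        System.out.println(platformSpecific);
    }
}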

Relevance: 30.00%

Abstract:

Geographic knowledge discovery (GKD) is the process of extracting information and knowledge from massive georeferenced databases. Usually the process is carried out by two different systems, Geographic Information Systems (GIS) and data mining engines. However, the development of those systems is a complex task because it does not follow a systematic, integrated and standard methodology. To overcome these pitfalls, in this paper we propose a modeling framework that addresses the development of the different parts of a multilayer GKD process. The main advantages of our framework are that: (i) it reduces the design effort, (ii) it improves the quality of the systems obtained, (iii) it is independent of platforms, (iv) it facilitates the use of data mining techniques on geo-referenced data, and finally, (v) it improves communication between the different users.

Relevance: 30.00%

Abstract:

This article presents an interactive Java software platform that enables any user to easily create advanced virtual laboratories (VLs) for Robotics. This novel tool provides both support for developing applications with a full 3D interactive graphical interface and a complete functional framework for modelling and simulating arbitrary serial-link manipulators. In addition, its software architecture contains a large number of functionalities provided as high-level tools, with the advantage of allowing any user to easily develop complex interactive robotic simulations with a minimum of programming. In order to show the features of the platform, the article describes, step by step, the implementation methodology of a complete VL for Robotics education using the presented approach. Finally, some educational results from the experience of implementing this approach are reported.
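The platform's actual API is not documented in this abstract, so the sketch below is a minimal, self-contained Java example of the kind of serial-link manipulator model such a virtual laboratory simulates: standard Denavit-Hartenberg forward kinematics for a simple two-link arm. The class and parameter names are illustrative only.

public class SerialLinkSketch {

    // One link described by standard DH parameters (theta offset, d, a, alpha).
    record DHLink(double theta, double d, double a, double alpha) { }

    // Homogeneous transform of a single revolute link for a given joint angle q.
    static double[][] linkTransform(DHLink L, double q) {
        double t = L.theta() + q, ct = Math.cos(t), st = Math.sin(t);
        double ca = Math.cos(L.alpha()), sa = Math.sin(L.alpha());
        return new double[][] {
            { ct, -st * ca,  st * sa, L.a() * ct },
            { st,  ct * ca, -ct * sa, L.a() * st },
            {  0,       sa,       ca, L.d()      },
            {  0,        0,        0, 1          }
        };
    }

    // 4x4 matrix product.
    static double[][] mul(double[][] A, double[][] B) {
        double[][] C = new double[4][4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                for (int k = 0; k < 4; k++)
                    C[i][j] += A[i][k] * B[k][j];
        return C;
    }

    // Forward kinematics: chain the link transforms for the joint vector q.
    static double[][] forwardKinematics(DHLink[] links, double[] q) {
        double[][] T = { {1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1} };
        for (int i = 0; i < links.length; i++) T = mul(T, linkTransform(links[i], q[i]));
        return T;
    }

    public static void main(String[] args) {
        // A simple planar 2R arm with 0.5 m links (hypothetical example values).
        DHLink[] arm = { new DHLink(0, 0, 0.5, 0), new DHLink(0, 0, 0.5, 0) };
        double[][] T = forwardKinematics(arm, new double[] { Math.PI / 4, Math.PI / 4 });
        System.out.printf("End-effector position: (%.3f, %.3f)%n", T[0][3], T[1][3]);
    }
}

In a VL of the kind described, a model like this would be wrapped by the platform's high-level tools and connected to the 3D interactive graphical interface, so that students manipulate joint values and observe the resulting end-effector motion without writing the kinematics code themselves.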