195 results for Devising
Abstract:
A three-year project, started on 1 November 2010, financed by the European Commission within the FP7 Space Program and aimed at developing an efficient de-orbit system that could be carried on board future spacecraft launched into LEO, will be presented. The operational system will deploy a thin uninsulated tape tether to collect electrons as a giant Langmuir probe, using no propellant and no power supply, and generating power on board. The project will involve free-fall tests, laboratory hypervelocity-impact and tether-current tests, and the design/manufacturing of subsystems: interface elements, electric control and driving module, electron-ejecting plasma contactor, tether-deployment mechanism/end-mass, and tape samples. Preliminary results to be presented involve: i) devising criteria for sizing the three disparate tape dimensions, which affect mass, resistance, current collection, magnetic self-field, and survivability against debris itself; ii) assessing the dynamical relevance of tether parameters in implementing control laws to limit oscillations in/off the orbital plane, where passive stability may be marginal; iii) deriving a law for bare-tape current from numerical simulations and chamber tests, taking into account the ambient magnetic field, ion ram motion, and adiabatic electron trapping; iv) determining requirements on a year-dormant hollow cathode under long-time/broad-emission-range operation, and trading off against the use of thermal electron emission; v) determining requirements on magnetic components and power semiconductors for a control module that faces high-voltage/high-power operation under mass/volume limitations; vi) assessing strategies to passively deploy a wide conductive tape that needs no retrieval, while avoiding jamming and ending at minimum libration; vii) evaluating the tape structure as regards conductive and dielectric materials, both lengthwise and in its cross-section, in particular to prevent arcing at triple-point junctions.
Abstract:
This article describes the design of a linear-observer/linear-controller-based robust output feedback scheme for output reference trajectory tracking tasks in nonlinear, multivariable, nonholonomic, underactuated mobile manipulators. The proposed scheme combines a classical linear feedback controller with suitably extended, high-gain, linear Generalized Proportional Integral (GPI) observers, which provide the controller with accurate simultaneous estimates of the phase variables associated with each flat output and of the exogenous and perturbation inputs. This information is used in the proposed feedback controller for (a) approximate, yet close, cancellation of the influence of the highly coupled nonlinearities, treated as lumped unstructured time-varying terms, and (b) devising proper linear output feedback control laws based on the approximate estimates of the string of phase variables associated with the flat outputs, simultaneously provided by the disturbance observers. Simulations reveal the effectiveness of the proposed approach.
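The observer construction described above can be illustrated with a minimal sketch. Everything below is invented for illustration and is not the paper's manipulator model: the plant is a simple double integrator y'' = u + z with an unknown lumped disturbance z, and a high-gain extended-state observer in the GPI spirit estimates the flat output's phase variables (y, y') together with z from measurements of y alone.

```python
# Hedged sketch (plant, gains and disturbance all invented): a high-gain
# extended-state observer in the GPI spirit. The plant is y'' = u + z;
# the observer is a copy of the plant, augmented with a disturbance
# state and driven by the output estimation error e = y - yh.

def simulate(T=5.0, dt=1e-3, l1=30.0, l2=300.0, l3=1000.0):
    y, yd = 0.0, 0.0             # true plant state (y, y')
    yh, ydh, zh = 0.0, 0.0, 0.0  # observer estimates of y, y', z
    z, u = 1.0, 0.0              # constant unknown disturbance, zero input
    for _ in range(int(T / dt)):
        # plant (explicit Euler step)
        y, yd = y + yd * dt, yd + (u + z) * dt
        # observer: plant copy corrected by the output error
        e = y - yh
        yh += (ydh + l1 * e) * dt
        ydh += (u + zh + l2 * e) * dt
        zh += l3 * e * dt
    return yh, ydh, zh

yh, ydh, zh = simulate()  # zh converges to the true z = 1.0
```

The gains were chosen so that the observer's error polynomial is s^3 + 30s^2 + 300s + 1000 = (s + 10)^3, i.e. a triple pole at s = -10; the estimation error then decays quickly and zh settles on the lumped disturbance, which a controller could cancel.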
Abstract:
Embedded systems are increasingly common and complex, so finding safe, efficient and inexpensive software development processes aimed specifically at this class of systems is more necessary than ever. Unlike in the recent past, current technological advances in microprocessors make it possible to build equipment with more than enough performance to run several software systems on a single machine. Moreover, there are embedded systems with safety requirements on whose correct operation many lives and/or large financial investments depend. These software systems are designed and implemented according to very strict and demanding software development standards. In some cases, certification of the software may also be required. For these cases, mixed-criticality systems can be a very valuable alternative. In this class of systems, applications with different criticality levels run on the same computer. However, it is often necessary to certify the entire system at the criticality level of the most critical application, which makes costs soar. Virtualization has been put forward as a very promising technology for containing those costs. This technology allows a set of virtual machines, or partitions, to run applications with very high levels of both temporal and spatial isolation. This, in turn, allows each partition to be certified independently. Developing partitioned mixed-criticality systems requires updating traditional software development models, since these cover neither the new activities nor the new roles required in the development of such systems.
For example, the system integrator must define the partitions, and the application developer must take into account the characteristics of the partition in which the application will run. Traditionally, the V-model has been especially relevant in embedded systems development. It has therefore been adapted to cover scenarios such as the parallel development of applications or the addition of a new partition to an existing system. The goal of this doctoral thesis is to improve the current technology for developing partitioned mixed-criticality systems. To that end, a framework has been designed and implemented that is aimed specifically at facilitating and improving the development processes of this class of systems. In particular, an algorithm has been created that generates the system partitioning automatically. All the activities needed to develop a partitioned system, including the new roles and activities mentioned above, have been integrated into the proposed framework. Furthermore, the design of the framework is based on Model-Driven Engineering, which promotes the use of models as fundamental elements of the development process. The framework thus provides the tools needed to model and partition the system, as well as to validate the results and generate the artifacts required to compile, build and deploy it. In addition, extensibility and integration with validation tools was a key factor in the design of the framework. In particular, new non-functional requirements can be incorporated, as can the generation of new artifacts such as documentation or support for different programming languages. A key part of the framework is the partitioning algorithm.
This algorithm has been designed to be independent of the applications' requirements and to allow the system integrator to implement new system requirements. To achieve this independence, partitioning constraints have been defined. The algorithm guarantees that these constraints will be satisfied in the partitioned system resulting from its execution. The partitioning constraints have been designed with enough expressive power that a small set of them can express most of the common non-functional requirements. Constraints can be defined manually by the system integrator or generated automatically by a tool from the functional and non-functional requirements of an application. The partitioning algorithm takes the system models and the partitioning constraints as inputs. As a result of its execution, it generates a deployment model that defines the partitions needed for the system. Each partition, in turn, defines which applications must run in it and the resources it needs to run correctly. The partitioning problem and the partitioning constraints are modeled mathematically as colored graphs, in which a proper vertex coloring represents a correct system partitioning. The algorithm has also been designed so that, if necessary, alternative partitionings to the one initially proposed can be obtained. The framework, including the partitioning algorithm, has been successfully tested on two industrial use cases: the UPMSat-2 satellite and a demonstrator of a wind turbine control system. Moreover, the algorithm has been validated by running numerous synthetic scenarios, including some very complex ones with more than 500 applications.
ABSTRACT The importance of embedded software is growing as it is required for a large number of systems. Devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing, and as a result it is currently possible to integrate complex systems in a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements: their failure may result in personal harm or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards; in some cases their certification is also necessary. This scenario fosters the use of mixed-criticality systems, in which applications of different criticality levels must coexist in a single system. In these cases it is usually necessary to certify the whole system, including non-critical applications, which is costly. Virtualization emerges as an enabling technology for dealing with this problem. The system is structured as a set of partitions, or virtual machines, that execute with temporal and spatial isolation. In this way, applications can be developed and certified independently. The development of MCPS (Mixed-Criticality Partitioned Systems) requires additional roles and activities that traditional systems do not. The system integrator has to define system partitions, and application development has to consider the characteristics of the partition to which each application is allocated. In addition, traditional software process models have to be adapted to this scenario. The V-model is commonly used in embedded systems development; it can be adapted to the development of MCPS by enabling the parallel development of applications or the addition of a partition to an existing system.
The objective of this PhD is to improve the available technology for MCPS development by providing a framework tailored to the development of this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The goal of the framework is to integrate all the activities required for developing MCPS and to support the different roles involved in this process. The framework is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process. The framework provides basic means for modeling the system, generating system partitions, validating the system and generating final artifacts. It has been designed to facilitate its extension and the integration of external validation tools. In particular, it can be extended by adding support for additional non-functional requirements and for final artifacts such as new programming languages or additional documentation. The framework includes a novel partitioning algorithm, designed to be independent of the types of application requirements and also to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved by defining partitioning constraints that must be met by the resulting partitioning. They have sufficient expressive capacity to state the most common constraints and can be defined manually by the system integrator or generated automatically from the functional and non-functional requirements of the applications. The partitioning algorithm uses system models and partitioning constraints as its inputs. It generates a deployment model composed of a set of partitions. Each partition is in turn composed of a set of allocated applications and assigned resources. The partitioning problem, including applications and constraints, is modeled as a colored graph, and a valid partitioning is a proper vertex coloring.
A specially designed algorithm generates this coloring and is able to provide alternative partitionings if required. The framework, including the partitioning algorithm, has been successfully used in the development of two industrial use cases: the UPMSat-2 satellite and the control system of a wind-power turbine. The partitioning algorithm has also been validated against a large number of synthetic loads, including complex scenarios with more than 500 applications.
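The graph-coloring formulation can be sketched as follows. This is an illustrative toy, not the thesis's algorithm: the application names are invented, only one constraint type is shown (two applications that must not share a partition, encoded as an edge), and a simple greedy proper coloring stands in for the real solver; each color then corresponds to one partition.

```python
# Toy sketch of partitioning as proper vertex coloring. An edge between
# two applications means "must be separated"; a proper coloring assigns
# different colors (partitions) to the endpoints of every edge.

def partition(apps, separation_constraints):
    """Greedy proper coloring; colors play the role of partitions."""
    adjacency = {a: set() for a in apps}
    for a, b in separation_constraints:
        adjacency[a].add(b)
        adjacency[b].add(a)
    color = {}
    for app in apps:  # visiting order could encode criticality priorities
        used = {color[n] for n in adjacency[app] if n in color}
        c = 0
        while c in used:  # smallest color not used by a neighbor
            c += 1
        color[app] = c
    partitions = {}
    for app, c in color.items():
        partitions.setdefault(c, []).append(app)
    return partitions

demo = partition(
    ["guidance", "telemetry", "payload", "logging"],
    [("guidance", "logging"), ("guidance", "payload")],
)
# demo groups "guidance" away from "payload" and "logging"
```

A real system would add further constraint types (resource budgets, criticality levels, affinity), which is why the thesis keeps the constraint language separate from the coloring engine.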
Abstract:
The purpose of this project is, first and foremost, to introduce the topic of nonlinear vibrations and oscillations in mechanical systems, and in particular nonlinear normal modes (NNMs), to a wider audience of researchers and technicians. To do so, the dynamical behavior and properties of nonlinear mechanical systems are first outlined through the analysis of a pair of exemplary models with the harmonic balance method. The conclusions drawn are contrasted with linear vibration theory. It is then argued how nonlinear normal modes can, in spite of their limitations, predict the frequency response of a mechanical system. After discussing these introductory concepts, I present a Matlab package called 'NNMcont', developed by a group of researchers from the University of Liège. This package allows the analysis of nonlinear normal modes of vibration in a range of mechanical systems as extensions of the linear modes; it relies on numerical methods and a continuation algorithm for the computation of the nonlinear normal modes of a conservative mechanical system. To demonstrate its functionality, a two-degree-of-freedom mechanical system with elastic nonlinearities is analyzed. This model comprises a mass suspended on a foundation by a spring/viscous-damper mechanism (analogous to a very simplified model of most suspended structures and machines) with an attached mass damper acting as a passive vibration control system. The results of the computation are displayed on frequency-energy plots showing the NNM branches, along with modal curves and time-series plots for each normal mode. Finally, a critical analysis of the results is carried out with an eye to what they can tell the researcher about the dynamical properties of the system.
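To make the harmonic balance idea concrete, here is a minimal sketch that is not taken from the project or from NNMcont: a single-term harmonic balance applied to an undamped Duffing oscillator x'' + w0^2 x + alpha x^3 = 0 (toy parameters). Substituting x(t) = A cos(wt) and balancing the cos(wt) terms, using cos^3 t = (3 cos t + cos 3t)/4, gives w^2 = w0^2 + (3/4) alpha A^2: the amplitude-dependent "backbone curve" that a nonlinear normal mode traces, in contrast to the fixed natural frequency of linear theory.

```python
import numpy as np

# Single-term harmonic balance for x'' + w0^2 x + alpha x^3 = 0.
# alpha > 0 is a hardening spring: the modal frequency grows with
# amplitude instead of staying constant as in the linear theory.

def backbone_frequency(amplitude, w0=1.0, alpha=0.5):
    return np.sqrt(w0**2 + 0.75 * alpha * amplitude**2)

amps = np.linspace(0.0, 2.0, 5)
freqs = backbone_frequency(amps)  # rises from w0 as amplitude grows
```

Packages such as NNMcont go further by keeping more harmonics and continuing such solution branches numerically in energy, but the single-harmonic backbone already captures the frequency-energy dependence the abstract refers to.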
Abstract:
A cross-sectional survey was made in 56 exceptionally healthy males, ranging in age from 20 to 84 years. Measurements were made of selected steroidal components and peptidic hormones in blood serum, and cognitive and physical tests were performed. Of those blood serum variables that gave highly significant negative correlations with age (r < −0.6), bioavailable testosterone (BT), dehydroepiandrosterone sulfate (DHEAS), and the ratio of insulin-like growth factor 1 (IGF-1) to growth hormone (GH) showed a stepwise pattern of age-related changes most closely resembling those of the age steps themselves. Of these, BT correlated best with significantly age-correlated cognitive and physical measures. Because DHEAS correlated well with BT and considerably less well than BT with the cognitive and physical measures, it seems likely that BT and/or substances to which BT gives rise in tissues play a more direct role in whatever processes are rate-limiting in the functions measured and that DHEAS relates more indirectly to these functions. The high correlation of IGF-1/GH with age, its relatively low correlation with BT, and the patterns of correlations of IGF-1/GH and BT with significantly age-correlated cognitive and physical measures suggest that the GH–IGF-1 axis and BT play independent roles in affecting these functions. Serial determinations made after oral ingestion of pregnenolone and data from the literature suggest there is interdependence of steroid metabolic systems with those operational in control of interrelations in the GH–IGF-1 axis. Longitudinal concurrent measurements of serum levels of BT, DHEAS, and IGF-1/GH together with detailed studies of their correlations with age-correlated functional measures may be useful in detecting early age-related dysregulations and may be helpful in devising ameliorative approaches.
Abstract:
The adaptation of Spanish universities to the European Higher Education Area (EEES in Spanish) demands the integration of new tools and skills to ease the teaching-learning process. This adaptation involves a change in evaluation methods, from a system where the student was assessed with a single final exam to a new system that includes continuous assessment, in which the final exam may represent at most 50% of the mark in the vast majority of universities. Devising a new and fair continuous evaluation system is not an easy task: it requires teachers to follow up on each student's learning process, and consequently imposes an additional workload on existing staff. Traditionally, continuous assessment is associated with the daily work of the student and a collection of marks based partly or entirely on the work done during the academic year. Small groups of students and attendance control are important aspects for an adequate assessment of students. However, most university degrees have groups of more than 70 students, and attendance control is a complicated task to perform, mostly because it consumes significant amounts of staff time. Another problem is that attendance control may push uninterested students to be present in class, which might disturb their classmates. After a two-year experience developing continuous assessment in Statistics subjects in Social Science degrees, we think that individual, periodic tasks are the best way to assess results. These tasks or examinations must be done in the classroom during regular lessons, so we need an efficient system for assembling different, personalized questions in order to prevent students from cheating.
In this paper we provide an efficient and effective way to generate randomized examination papers using Sweave, a tool that produces data, graphics and statistical computations with the R software and presents the results in PDF documents created with LaTeX. In this way, we can design an exam template that can be compiled to generate as many PDF documents as required while, at the same time, providing the solutions needed to correct them easily.
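The paper's tool chain is Sweave with R and LaTeX; as a language-agnostic illustration of the underlying idea (all names and numbers below are hypothetical, and Python stands in for R), each examination paper gets its own reproducible random data set, and the matching solution is computed at generation time so that correction can be automated.

```python
import random

# Minimal sketch of per-student randomized exam generation: seed the
# RNG from the student id so each paper is different but reproducible,
# and compute the solution alongside the question.

def make_exam_item(student_id, seed_base=2024):
    rng = random.Random(seed_base + student_id)  # reproducible per student
    sample = [rng.randint(1, 50) for _ in range(8)]
    question = f"Compute the mean of the sample {sample}."
    solution = round(sum(sample) / len(sample), 2)
    return question, solution

papers = [make_exam_item(i) for i in range(3)]  # three distinct papers
```

In the Sweave setting, the question text would live in a LaTeX template and the data and solution would be injected by R code chunks at compile time; the seeding trick is the same.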
Abstract:
This paper assesses the uses and misuses in the application of the European Arrest Warrant (EAW) system in the European Union. It examines the main quantitative results of this extradition system between 2005 and 2011, on the basis of the existing statistical knowledge of its implementation at official EU levels. The EAW has been anchored in a high level of 'mutual trust' between the participating states' criminal justice regimes and authorities. This reciprocal confidence, however, has been subject to an increasing number of challenges resulting from its practical application, presenting a dual conundrum: 1. Principle of proportionality: who are the competent judicial authorities cooperating with each other and ensuring that there are sufficient impartial controls over the necessity and proportionality of decisions on the issuing and execution of EAWs? 2. Principle of division of powers: how can criminal justice authorities be expected to reconcile different criminal judicial traditions regarding what constitutes a 'serious' or 'minor' crime in their respective legal settings, and who is ultimately to determine (divorced from political considerations) when it is duly justified to make the EAW system operational? It is argued that the next generation of the EU's criminal justice cooperation and the EAW needs to recognise and acknowledge that the mutual-trust premise upon which the European system has been built so far is no longer viable without devising new EU stakeholder structures and evaluation mechanisms. These should allow for the recalibration of mutual trust and mistrust in EU justice systems in light of the experiences of the criminal justice actors and practitioners with a stake in putting the EAW into daily effect. Such a 'bottom-up' approach should be backed up by the best impartial and objective evaluation, an improved system of statistical collection and an independent qualitative assessment of its implementation.
This should be placed as the central axis of a renewed EAW framework which should seek to better ensure the accountability, impartial (EU-led) scrutiny and transparency of member states’ application of the EAW in light of the general principles and fundamental rights constituting the foundations of the European system of criminal justice cooperation.
Abstract:
The energy sector, especially natural gas trade, is one of the key areas of co-operation between the EU and Russia. However, the character of this co-operation has given rise to increasing doubts both in Brussels and among the EU member states. Questions have emerged as to whether this co-operation makes the EU excessively dependent on Russian energy supplies, and whether Gazprom's presence in the EU will allow Moscow to interfere in the process of devising EU energy policy. This report is intended to present the factual base and data necessary to provide accurate answers to these questions. The first part of the report presents the scope and character of Gazprom's economic presence in the EU member states. The second part shows the presence of EU investors in Russia. The data presented has been provided by the International Energy Agency, the European Commission, the Central Bank of Russia and the Russian Federal State Statistics Service. Some of the data results from calculations made by the Centre for Eastern Studies' experts, based on data provided by energy companies, the specialist press and news agencies.
Abstract:
This paper aims at devising scenarios for the development of the financial system in the southern and eastern Mediterranean countries (SEMCs) for the 2030 horizon. The results of our simulations indicate that bank credit to the private sector, meta-efficiency and stock market turnover could reach at best 108%, 78% and 121%, respectively, if the SEMCs adopt the best practices in Europe. These levels are much higher than the present ones in the region, but still lower than those of the best performers in Europe. More specifically, we find that improving the quality of institutions, increasing per capita GDP, further opening the capital account and lowering inflation are needed to enable the financial system in the region to converge with those of Europe.
Abstract:
Aims The aims of this study are to develop and validate a measure to screen for a range of gambling-related cognitions (GRC) in gamblers. Design and participants A total of 968 volunteers were recruited from a community-based population. They were divided randomly into two groups. Principal axis factoring with varimax rotation was performed on group one, and confirmatory factor analysis (CFA) was used on group two to confirm the best-fitting solution. Measurements The Gambling Related Cognition Scale (GRCS) was developed for this study, and the South Oaks Gambling Screen (SOGS), the Motivation Towards Gambling Scale (MTGS) and the Depression Anxiety Stress Scale (DASS-21) were used for validation. Findings Exploratory factor analysis performed using half the sample indicated five factors: interpretative control/bias (GRCS-IB), illusion of control (GRCS-IC), predictive control (GRCS-PC), gambling-related expectancies (GRCS-GE) and a perceived inability to stop gambling (GRCS-IS). These accounted for 70% of the total variance. Using the other half of the sample, CFA confirmed that the five-factor solution fitted the data most effectively. Cronbach's alpha coefficients ranged from 0.77 to 0.91 for the factors, and reached 0.93 for the overall scale. Conclusions This paper demonstrated that the 23-item GRCS has good psychometric properties and is thus a useful instrument for identifying GRC among non-clinical gamblers. It provides a first step towards devising/adapting similar tools for problem gamblers, as well as developing more specialized instruments to assess particular domains of GRC.
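The internal-consistency statistic reported above can be computed as follows; the toy response matrix below is invented purely for illustration. For a k-item scale, Cronbach's alpha is alpha = k/(k-1) * (1 - sum of item variances / variance of the total scores).

```python
import statistics

# Sketch of Cronbach's alpha for a k-item scale. Each inner list holds
# one item's scores across all respondents (toy data, 3 items x 5 people).

def cronbach_alpha(items):
    """items: list of per-item score lists, all of equal length."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]      # per-respondent sums
    item_var = sum(statistics.pvariance(it) for it in items)
    return k / (k - 1) * (1 - item_var / statistics.pvariance(totals))

toy = [[3, 4, 5, 2, 4], [3, 5, 4, 2, 5], [2, 4, 5, 3, 4]]
alpha = cronbach_alpha(toy)  # high when items vary together
```

Values near the 0.77 to 0.93 range the abstract reports indicate that responses to the items move together, i.e. the subscales and the overall scale are internally consistent.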
Abstract:
A significant proportion of the human population suffers from some form of skin disorder, whether it be from burn injury or inherited skin anomalies. The ideal treatment for skin disorders would be to regrow skin tissue from stem cells residing in the individual patient's skin. Locating these adult stem cells and elucidating the molecules involved in orchestrating the production of new skin cells are important steps in devising more-efficient methods of skin production and wound healing via the ex vivo expansion of patient keratinocytes in culture. This review focuses on the structure of the skin, the identification of skin stem cells, and the role of Notch, Wnt and Hedgehog signalling cascades in regulating the fate of epidermal stem cells. © 2005 Cambridge University Press.
Abstract:
Childhood obesity is becoming a topical issue in both the health literature and the popular media, and child health nurses increasingly observe preschool children who appear disproportionately heavy for their height when plotted on standardised growth charts. In this paper, literature related to childhood obesity in New Zealand and internationally is explored to identify current issues, and the implications of these issues for nurses in community-based child health practice are discussed. Themes that emerged from the literature relate to the measurement of obesity, links between childhood and adult obesity, and issues for families. A theme around maternal perception was of particular interest: studies that investigated maternal perceptions of childhood obesity found that mothers identified their child as overweight or obese only when the weight imposed limitations on physical activity or when the child was teased, rather than by referring to individual growth charts. The implications for nursing in child health practice are discussed, as nurses working in this area need an understanding of the complex and often emotive issues surrounding childhood obesity, and an awareness of the reality of people's lives, when devising health promotion strategies.
Abstract:
Visualisation of multiple isoforms of kappa-casein on 2-D gels is restricted by the abundant alpha- and beta-caseins that not only limit gel loading but also migrate to similar regions as the more acidic kappa-casein isoforms. To overcome this problem, we took advantage of the absence of cysteine residues in alpha(S1)- and beta-casein by devising an affinity enrichment procedure based on reversible biotinylation of cysteine residues. Affinity capture of cysteine-containing proteins on avidin allowed the removal of the vast majority of alpha(S1)- and beta-casein, and on subsequent 2-D gel analysis 16 gel spots were identified as kappa-casein by PMF. Further analysis of the C-terminal tryptic peptide along with structural predictions based on mobility on the 2-D gel allowed us to assign identities to each spot in terms of genetic variant (A or B), phosphorylation status (1, 2 or 3) and glycosylation status (from 0 to 6). Eight isoforms of the A and B variants with the same PTMs were observed. When the casein fraction of milk from a single cow, homozygous for the B variant of kappa-casein, was used as the starting material, 17 isoforms from 13 gel spots were characterised. Analysis of isoforms of low abundance proved challenging due to the low amount of material that could be extracted from the gels as well as the lability of the PTMs during MS analysis. However, we were able to identify a previously unrecognised site, T-166, that could be phosphorylated or glycosylated. Despite many decades of analysis of milk proteins, the reasons for this high level of heterogeneity are still not clear.
Abstract:
The use of multiple partial viewpoints is recommended for specification. We believe they can also be useful for devising testing strategies. In this paper, we use Object-Z to formally specify concurrent Java components from viewpoints based on the separation of the application and synchronisation concerns inherent in Java monitors. We then apply the Test-Template Framework to the Object-Z viewpoints to devise a strategy for testing the components. When combining the test templates for the different viewpoints, we focus on the observable behaviour of the application to systematically derive a practical testing strategy. The Producer-Consumer and Readers-Writers problems are considered as case studies.
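A concrete picture of the kind of component under test may help. The paper works with Java monitors and Object-Z; the transliteration below to Python and a bounded buffer is my own invention, chosen because it makes the two concerns visibly separable: the application concern (storing and fetching items) versus the synchronisation concern (waiting and notifying on emptiness/fullness), which is exactly the split the viewpoints formalise.

```python
import threading

# A monitor-style bounded buffer (Producer-Consumer). The lines marked
# "synchronisation" could be specified in one viewpoint, the lines
# marked "application" in another.

class BoundedBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []
        self.cond = threading.Condition()  # the monitor lock

    def put(self, item):  # producer side
        with self.cond:
            while len(self.items) >= self.capacity:  # synchronisation
                self.cond.wait()
            self.items.append(item)                  # application
            self.cond.notify_all()

    def get(self):  # consumer side
        with self.cond:
            while not self.items:                    # synchronisation
                self.cond.wait()
            item = self.items.pop(0)                 # application
            self.cond.notify_all()
            return item

buf = BoundedBuffer(2)
buf.put("a")
buf.put("b")
first = buf.get()  # FIFO order: "a" comes out first
```

Test templates derived from the application viewpoint would exercise FIFO behaviour as above, while templates from the synchronisation viewpoint would target blocking on an empty or full buffer under concurrent producers and consumers.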
Abstract:
Edges are key points of information in visual scenes. One important class of models supposes that edges correspond to the steepest parts of the luminance profile, implying that they can be found as peaks and troughs in the response of a gradient (first-derivative) filter, or as zero-crossings (ZCs) in the second-derivative. A variety of multi-scale models are based on this idea. We tested this approach by devising a stimulus that has no local peaks of gradient and no ZCs, at any scale. Our stimulus profile is analogous to the classic Mach-band stimulus, but it is the local luminance gradient (not the absolute luminance) that increases as a linear ramp between two plateaux. The luminance profile is a smoothed triangle wave and is obtained by integrating the gradient profile. Subjects used a cursor to mark the position and polarity of perceived edges. For all the ramp-widths tested, observers marked edges at or close to the corner points in the gradient profile, even though these were not gradient maxima. These new Mach edges correspond to peaks and troughs in the third-derivative. They are analogous to Mach bands - light and dark bars are seen where there are no luminance peaks but there are peaks in the second derivative. Here, peaks in the third derivative were seen as light-to-dark edges, troughs as dark-to-light edges. Thus Mach edges are inconsistent with many standard edge detectors, but are nicely predicted by a new model that uses a (nonlinear) third-derivative operator to find edge points.
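The stimulus logic can be sketched numerically (all parameters below are invented, not those of the experiment): build a luminance profile by integrating a gradient that ramps linearly between two plateaux, then check that the third derivative, rather than the first or second, peaks at the ramp's corner points, where the Mach edges are reported.

```python
import numpy as np

# The luminance *gradient* is two plateaux joined by a linear ramp, so
# luminance itself has no local gradient maximum anywhere. The third
# derivative of luminance, however, spikes at the ramp's corners.

x = np.linspace(0.0, 1.0, 1001)
low, high, ramp_on, ramp_off = 0.2, 1.0, 0.4, 0.6
gradient = np.interp(x, [0, ramp_on, ramp_off, 1], [low, low, high, high])
luminance = np.cumsum(gradient) * (x[1] - x[0])  # integrate the gradient

d3 = np.gradient(np.gradient(np.gradient(luminance, x), x), x)
corner = x[np.argmax(d3)]  # near ramp_on: predicted light-to-dark edge
trough = x[np.argmin(d3)]  # near ramp_off: predicted dark-to-light edge
```

A first-derivative (gradient) peak detector or a second-derivative zero-crossing detector finds nothing salient in this profile, which is why the reported edge percepts favour a third-derivative operator.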