981 results for software management infrastructure


Relevância:

80.00%

Publicador:

Resumo:

Low-power processors and accelerators that were originally designed for the embedded systems market are emerging as building blocks for servers. Power capping has been actively explored as a technique to reduce the energy footprint of high-performance processors. The opportunities and limitations of power capping on the new low-power processor and accelerator ecosystem are less well understood. This paper presents an efficient power capping and management infrastructure for heterogeneous SoCs based on hybrid ARM/FPGA designs. The infrastructure coordinates dynamic voltage and frequency scaling with task allocation on a customised Linux system for the Xilinx Zynq SoC. We present a compiler-assisted power model that guides voltage and frequency scaling, in conjunction with workload allocation between the ARM cores and the FPGA, under given power caps. The model achieves less than 5% estimation bias relative to mean power consumption. In an FFT case study, the proposed power capping schemes achieve on average 97.5% of the performance of the optimal execution and match the optimal execution in 87.5% of cases, while always meeting power constraints.
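
A minimal sketch of the kind of decision described above is given below: a simple analytical power model is used to pick the highest-throughput combination of ARM DVFS point and FPGA offload that stays under a power cap. The model form, operating points and coefficients are illustrative assumptions, not the paper's calibrated compiler-assisted model.

```python
# Hedged sketch: choosing a DVFS point and an ARM/FPGA split under a power cap.
# The power model (static + a*f*V^2 term) and all numbers are assumptions for
# illustration only, not the paper's measured Zynq figures.

from itertools import product

# Hypothetical operating points for the ARM cores: (frequency_MHz, voltage_V).
ARM_POINTS = [(333, 0.9), (667, 1.0), (1000, 1.1)]
FPGA_ACTIVE_POWER_W = 1.2      # assumed extra power when offloading to the FPGA
STATIC_POWER_W = 0.6           # assumed board-level static power

def arm_dynamic_power(freq_mhz, volt, activity):
    """Activity-scaled dynamic power, P ~ a * f * V^2 (assumed model form)."""
    return 1.4e-3 * activity * freq_mhz * volt ** 2

def throughput(freq_mhz, offload):
    """Assumed performance model: FPGA offload gives a fixed speed-up."""
    base = freq_mhz / 1000.0
    return base * (3.0 if offload else 1.0)

def best_config(power_cap_w, activity=1.0):
    """Pick the highest-throughput configuration that respects the cap."""
    best = None
    for (f, v), offload in product(ARM_POINTS, (False, True)):
        power = STATIC_POWER_W + arm_dynamic_power(f, v, activity)
        if offload:
            power += FPGA_ACTIVE_POWER_W
        if power <= power_cap_w:
            cand = (throughput(f, offload), (f, v, offload, power))
            best = max(best, cand) if best else cand
    return best

if __name__ == "__main__":
    # Returns the chosen (throughput, (freq, volt, offload, power)) tuple.
    print(best_config(power_cap_w=2.5))
```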

Relevância:

80.00%

Publicador:

Resumo:

Design and implementation of a data model that supports the inventory of a telecommunications network and its management from geographic information systems (GIS). The work includes the development of clients and interfaces to other existing applications and the integration with the organisation's work processes. Innovative aspects are considered so that the system can be fed back by its own users, allowing solutions based on free software or on the development processes established for that type of software.
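
As an illustration of what a GIS-backed network inventory model can look like, the sketch below defines minimal node and cable entities with WKT geometries and a simple topology query. The entity names, attributes and geometry encoding are assumptions for illustration, not the data model developed in the project.

```python
# Hedged sketch of a minimal telecom network inventory for GIS management.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    kind: str              # e.g. "exchange", "cabinet", "splice_closure"
    geometry_wkt: str      # point geometry, e.g. "POINT(2.17 41.38)"

@dataclass
class Cable:
    cable_id: str
    from_node: str
    to_node: str
    fibre_count: int
    geometry_wkt: str      # linestring following the duct route

@dataclass
class Inventory:
    nodes: dict[str, Node] = field(default_factory=dict)
    cables: list[Cable] = field(default_factory=list)

    def cables_at(self, node_id: str) -> list[Cable]:
        """All cables terminating at a given node (simple topology query)."""
        return [c for c in self.cables if node_id in (c.from_node, c.to_node)]

inv = Inventory()
inv.nodes["N1"] = Node("N1", "exchange", "POINT(2.17 41.38)")
inv.nodes["N2"] = Node("N2", "cabinet", "POINT(2.18 41.39)")
inv.cables.append(Cable("C1", "N1", "N2", 48,
                        "LINESTRING(2.17 41.38, 2.18 41.39)"))
print([c.cable_id for c in inv.cables_at("N2")])
```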

Relevância:

80.00%

Publicador:

Resumo:

Background: Appropriately conducted adaptive designs (ADs) offer many potential advantages over conventional trials. They make better use of accruing data, potentially saving time, trial participants, and limited resources compared to conventional, fixed sample size designs. However, one can argue that ADs are not implemented as often as they should be, particularly in publicly funded confirmatory trials. This study explored barriers, concerns, and potential facilitators to the appropriate use of ADs in confirmatory trials among key stakeholders. Methods: We conducted three cross-sectional, online parallel surveys between November 2014 and January 2015. The surveys were based upon findings drawn from in-depth interviews of key research stakeholders, predominantly in the UK, and targeted Clinical Trials Units (CTUs), public funders, and private sector organisations. Response rates were as follows: 30 (55%) for UK CTUs, 17 (68%) for the private sector, and 86 (41%) for public funders. A Rating Scale Model was used to rank barriers and concerns in order of perceived importance for prioritisation. Results: Top-ranked barriers included the lack of bridge funding accessible to UK CTUs to support the design of ADs, limited practical implementation knowledge, preference for traditional mainstream designs, difficulties in marketing ADs to key stakeholders, time constraints to support ADs relative to competing priorities, lack of applied training, and insufficient access to case studies of undertaken ADs to facilitate practical learning and successful implementation. Associated practical complexities and inadequate data management infrastructure to support ADs were reported as more pronounced in the private sector. For funders of public research, inadequate description of the rationale, scope, and decision-making criteria to guide the planned AD in researchers' grant proposals was viewed as a major obstacle. Conclusions: There are still persistent and important perceptions of individual and organisational obstacles hampering the use of ADs in confirmatory trials research. Stakeholder perceptions about barriers are largely consistent across sectors, with a few exceptions that reflect differences in organisations' funding structures, experiences, and characterisation of study interventions. Most barriers appear connected to a lack of practical implementation knowledge and applied training, and limited access to case studies to facilitate practical learning. Keywords: adaptive designs; flexible designs; barriers; surveys; confirmatory trials; Phase 3; clinical trials; early stopping; interim analyses

Relevância:

80.00%

Publicador:

Resumo:

Research shows that changes in information technology and the growing acquisition of new software have led to underlying problems that can affect heterogeneous software licensing environments and large organisations. The underlying problem in the broader context is software management; software license management is one branch of that larger problem. Large organisations such as a municipality are affected by this underlying problem because of the complexity of the organisation's environment. Applying changes in the area of software licensing is impossible without changing the entire organisational process that accompanies it. The case study concerns a new and extensive area of software license management that can be very instructive and a valuable experience to take part in. The thesis describes what software license management looks like in a municipal organisation and the problems with the current license management process. Preparatory work in the form of a literature study, together with the data generation methods of interviews, document studies and observations, is used to study the case in depth. The goal is to identify the current problems, analyse them and give recommendations for measures that the studied case object, the IT office of Falu Kommun, can apply. A recommendation for a clear license management process model is considered a good academic contribution, since software license management is a general problem. The result of the thesis is a process model for software license management for organisations with IT service customers. It is a generic solution that could also be used by other municipalities and similar organisations.

Relevância:

80.00%

Publicador:

Resumo:

While contemporary Western planning traditions in Australia talk of the last 200 years of innovation and the transposition of European and North American planning traditions onto the Australian landscape, they neglect to mention some 40,000-50,000 years of Indigenous landscape planning initiatives and practice. The ancestral country of the Gunditjmara people is in the Western District of Victoria, focused on the Lake Condah and Mount Eccles localities. The Gunditjmara had, and continue to have, a strong social, cultural, land management and planning presence in the region, in particular linked to environmental engineering initiatives and aquaculture curatorship of eel and fish resources. Archaeological evidence confirms that some 10,000 years of pre-European-contact landscape planning practice was applied by the Gunditjmara to construct resource management infrastructure to service a regional food need as well as a community need. Within contemporary reconciliation discourses, the Gunditjmara have actively sought over the last 25 years the rehabilitation of Lake Condah, which is now coming to fruition, and the restoration of their traditional landscape planning and management responsibilities. This paper reviews the restoration of Indigenous landscape planning and management theory and practice by the Gunditjmara, pointing to significant policy and practice success as well as the need to better appreciate this culturally attuned and ecologically responsive approach to landscape planning, born of generations of knowledge.

Relevância:

80.00%

Publicador:

Resumo:

It is well known that Brazil faces one of its greatest challenges in the field of education. Educational projects such as the Instituto Unibanco's Programa Jovem de Futuro allow a detailed investigation of assumptions widely studied in academia. Through technical support in management and financial incentives for the schools served by the Programme, the aim is to improve students' achievement in mathematics and Portuguese. Focusing on the São Paulo and Rio de Janeiro schools that took part in the Programme between 2010 and 2012, significant average impacts on the achievement of participating schools can be verified, except for the group of schools in São Paulo - Capital. The allocation of financial resources by school principals can be associated with the school production function. Starting from the hypothesis that this function takes as inputs the investment categories assigned by the school - Gestão Escolar (infrastructure), Incentivo Professor (bonuses and awards for teachers) and Incentivo Aluno (bonuses and awards for students) - one can study how achievement behaves as a function of the inputs employed. The allocation analysis indicated that investment in the Incentivo Aluno category is significant in explaining school achievement on the exams applied by the Instituto Unibanco for the current year. When the effect of investment accumulated over time is analysed, the Gestão Escolar category proved significant in explaining achievement on the exams applied by the Instituto Unibanco. School principals appear to know the school production function: they know that investments in Gestão Escolar (infrastructure) pay off in the long run, while investments in Incentivo Aluno produce more results in the short run.
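
To make the production-function framing concrete, a hedged sketch of the kind of specification implied by the abstract is shown below; the linear form and variable names are illustrative assumptions, not the estimated model.

```latex
% Illustrative school production function (assumed linear specification):
% achievement as a function of the three investment categories.
\begin{equation}
  Y_{it} = \alpha
         + \beta_1\,\mathrm{GestaoEscolar}_{it}
         + \beta_2\,\mathrm{IncentivoProfessor}_{it}
         + \beta_3\,\mathrm{IncentivoAluno}_{it}
         + \varepsilon_{it}
\end{equation}
% Y_{it}: school i's score on the Instituto Unibanco exams in year t.
% A long-run version would replace current-year amounts with the investment
% accumulated up to year t.
```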

Relevância:

80.00%

Publicador:

Resumo:

Water security, which is essential to life and livelihood, health and sanitation, is determined not only by the water resource, but also by the quality of water, the ability to store surplus from precipitation and runoff, and access to and affordability of supply. All of these measures have financial implications for national budgets. The water sector, in the context of this paper's assessment and discussion of the impact of climate change, includes consideration of the existing and projected available water resource and the demand in terms of: quantity and quality of surface and ground water; water supply infrastructure (collection, storage, treatment, distribution); and potential for adaptation. Wastewater management infrastructure is also considered a component of the water sector. Saint Vincent and the Grenadines has two distinct hydrological regimes: mainland St Vincent is one of the wetter islands of the eastern Caribbean, whereas the Grenadines have a drier climate than St Vincent. Surface water is the primary source of water supply on St Vincent, whereas the Grenadines depend on man-made catchments, rainwater harvesting, wells, and desalination. The island state is considered already water stressed, as marked seasonality in rainfall, inadequate supply infrastructure, and limited institutional capacity constrain water supply. Economic modelling approaches were implemented to estimate sectoral demand and supply between 2011 and 2050. Residential, tourism and domestic demand were analysed for the A2, B2 and BAU scenarios. In each of the three scenarios - A2, B2 and BAU - Saint Vincent and the Grenadines will have a water gap, represented by the difference between the demand and supply curves over the forecast period of 2011 to 2050. The amount of water required increases steadily between 2011 and 2050, implying increasing demand on the country's resources: the water supply that is available cannot respond adequately to the demand. The Global Water Partnership, in its 2005 policy brief, suggested that the best way for countries to build the capacity to adapt to climate change is to improve their ability to cope with today's climate variability (GWP, 2005). This suggestion is most applicable for St Vincent and the Grenadines, as the variability being experienced has already placed the island nation under water stress. Strategic priorities should therefore be adopted to increase water production, increase efficiency, strengthen the institutional framework, and decrease wastage. Cost-benefit analysis was stymied by data availability, but the "no-regrets approach", which intimates that adaptation measures will be beneficial to the land, people and economy of Saint Vincent and the Grenadines with or without climate change, should be adopted.
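
The water-gap concept above is simply the year-by-year difference between projected demand and available supply. The sketch below shows that calculation; the growth rates and starting volumes are invented for illustration and are not the study's A2/B2/BAU figures.

```python
# Hedged sketch of the water-gap calculation: demand minus supply per year.
def project(start_value, annual_growth, years):
    """Simple exponential projection of demand or supply."""
    return [start_value * (1 + annual_growth) ** t for t in range(years)]

YEARS = list(range(2011, 2051))
demand = project(start_value=12.0, annual_growth=0.015, years=len(YEARS))  # Mm^3/yr (assumed)
supply = project(start_value=11.0, annual_growth=0.002, years=len(YEARS))  # Mm^3/yr (assumed)

water_gap = [max(0.0, d - s) for d, s in zip(demand, supply)]

for year, gap in zip(YEARS, water_gap):
    if year % 10 == 1:   # print 2011, 2021, 2031, 2041
        print(f"{year}: gap = {gap:.2f} Mm^3")
```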

Relevância:

80.00%

Publicador:

Resumo:

The Stroop Effect Detector (SED) is an assistive software tool, developed through the Desarrollo Tecnológico Social research programme of the Universidad de Las Palmas de Gran Canaria, that helps professionals in the neuropsychological field to identify problems in an individual's orbitofrontal cortex, using the technique devised by Schenker in 1998. As a methodological basis, the knowledge acquired in the different subjects of the adaptation course for the degree in Computer Engineering, such as Software Management, Software Architecture and User Interface Development, was used, as well as knowledge previously acquired in the Programming and Software Engineering I and II courses. Since computing knowledge alone was not enough to carry out this project, I also researched the problem itself, gathering information from other scientific documents on the subject and consulting professionals in the field, such as Dr Ayoze Nauzet González Hernández, neurologist at the Doctor Negrín hospital in Las Palmas de Gran Canaria, and the psychologist José Manuel Rodríguez Pellejero, who discussed this problem in a class of the Master's in Teacher Training that I am currently taking. This work presents the Stroop test with Schenker's two versions: RCN (Reading Color Names) and NCW (Naming Colored Words). As a general rule, both tests present the study subjects with words (names of colours) written in ink of different colours. RCN consists of reading the written word while ignoring the colour of its font and trying not to be influenced by it. In contrast, NCW requires naming the colour of the ink in which the word is written, without being influenced by the fact that the word itself is the name of a colour.
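
To illustrate the task logic of the two test versions, the sketch below generates congruent and incongruent Stroop trials and scores responses under the RCN and NCW rules. It is an illustration only; it is not the SED tool's implementation or Schenker's scoring procedure.

```python
# Hedged sketch of RCN/NCW Stroop trial generation and scoring.
import random

COLOURS = ["red", "green", "blue", "yellow"]

def make_trial(congruent: bool):
    """A trial is a colour word displayed in some ink colour."""
    word = random.choice(COLOURS)
    ink = word if congruent else random.choice([c for c in COLOURS if c != word])
    return {"word": word, "ink": ink}

def correct_answer(trial, mode):
    """RCN: read the word, ignore the ink. NCW: name the ink, ignore the word."""
    if mode == "RCN":
        return trial["word"]
    if mode == "NCW":
        return trial["ink"]
    raise ValueError(mode)

# Build a small mixed block and score a (simulated, error-free) response sequence.
trials = [make_trial(congruent=random.random() < 0.5) for _ in range(10)]
responses = [correct_answer(t, "NCW") for t in trials]
score = sum(r == correct_answer(t, "NCW") for t, r in zip(trials, responses))
print(f"NCW block: {score}/{len(trials)} correct")
```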

Relevância:

80.00%

Publicador:

Resumo:

We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment and socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems.

Relevância:

80.00%

Publicador:

Resumo:

E-learning platforms are used for the administrative support of teaching and learning processes; based on the Internet, they offer functions for distributing teaching and learning materials and for communication between teachers and learners. Numerous scientific contributions and market studies deal with the multi-criteria evaluation of these software products to provide an informational basis for strategic investment decisions. By contrast, instruments for cost-oriented controlling of e-learning platforms are treated at best marginally. This contribution therefore takes up the concept of Total Cost of Ownership (TCO), which provides a methodological starting point for creating cost transparency for e-learning platforms. Building on the conceptual foundations, problem areas and application potentials for the cost-oriented controlling of LMS are identified. For the software-supported construction and analysis of TCO models, the open source tool TCO-Tool is introduced and its use is discussed on the basis of a synthetic case example. Finally, further development perspectives of the TCO concept in the context of e-learning are identified. The topic presented is not only of theoretical interest, but also addresses the growing need of practitioners in education for instruments that provide an informational basis for investment and divestment decisions in the e-learning environment.
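
A minimal sketch of a TCO calculation for an e-learning platform is shown below: one-off and recurring cost categories are summed over a planning horizon, optionally discounted. The categories and amounts are illustrative assumptions and are not output of the TCO-Tool or the article's case example.

```python
# Hedged sketch of a simple TCO model for an e-learning platform.
ONE_OFF_COSTS = {            # incurred in year 0 (EUR, assumed)
    "licences": 20000,
    "hardware": 15000,
    "implementation": 30000,
}
ANNUAL_COSTS = {             # recurring every year (EUR, assumed)
    "maintenance": 8000,
    "hosting": 6000,
    "support_staff": 25000,
    "training": 4000,
}

def total_cost_of_ownership(years: int, discount_rate: float = 0.0) -> float:
    """Sum one-off costs plus (optionally discounted) recurring costs."""
    tco = float(sum(ONE_OFF_COSTS.values()))
    annual = sum(ANNUAL_COSTS.values())
    for year in range(1, years + 1):
        tco += annual / (1 + discount_rate) ** year
    return tco

print(f"5-year TCO: {total_cost_of_ownership(5):,.0f} EUR")
print(f"5-year TCO at 4% discount: {total_cost_of_ownership(5, 0.04):,.0f} EUR")
```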

Relevância:

80.00%

Publicador:

Resumo:

This Ph.D. dissertation proposes a pan-European, transnational telematic voting system capable of meeting the strictest security requirements. This transnational approach is a significant innovation that entails identifying citizens beyond the borders of their own country, and thus requires that all European citizens have a digital identity that is recognised beyond the borders of their country of origin. On these premises, the proposal is developed along two complementary lines: first, the design of a voting system capable of winning the confidence of European governments and citizens and, second, a solution to the problem of interoperability of Identity Management Systems (IDMs) that is consistent with the work currently being carried out by the EU to integrate the services provided by the public administrations of the different European countries. The starting point of this work is the identification of the requirements for the adequate functioning of a telematic voting system; from these, a set of elements and criteria is proposed that allows different telematic voting systems to be compared and the suitability of the proposed system to be evaluated. The thesis then analyses in depth the most recent and significant telematic voting experiences carried out by different countries to automate their electoral processes, showing that even the most recent systems still have significant security shortcomings. It also finds that a significant portion of the population remains wary and often questions the validity of the published results. Therefore, a system that aspires to win the trust of citizens and governments must not only operate correctly, transferring traditional voting processes into a telematic environment, but must also provide additional mechanisms that can overcome the fears aroused by the new voting system.
Accordingly, this thesis focuses, first, on creating irrefutable, understandable and auditable proof throughout the voting process that can demonstrate with certainty, to all actors involved in the process (government, political parties, voters, polling station officials, observers, electoral authorities, judges, etc.), that the published results are accurate and that the principles of anonymity and "one person, one vote" have not been violated. Under this approach, the solution presented in this thesis not only provides mechanisms to minimise the risk of vote buying, but also incorporates robust security mechanisms that can detect possible attempts to manipulate the system and identify the responsible party. Additionally, the thesis goes one step further and moves the voting scenario to a pan-European setting, where new problems appear. Indeed, one of the main challenges currently facing transnational voting is the lack of rigorous and dynamic procedures for the synchronised updating of the voter rolls of the participating countries, free from errors that could allow a person to cast more than one vote or prevent a person from exercising the right to vote at all. This recognition of transnational identity requires interoperability between the IDMs of the different European countries. To solve this problem, the thesis relies on proposals emerging within the EU that are expected to be consolidated in the coming years, both in digital identity (with the launch of the European Citizen Card) and in the deployment of an identity management infrastructure that makes interoperability between the IDMs of the different member states possible. Building on these, the thesis proposes a telematic infrastructure that enables interoperability between the census management systems of the European states in which an election is jointly held. The result is a versatile, secure, robust, reliable and auditable system that can be applied in pan-European elections and that treats the dynamic updating of the voter rolls as a critical part of the voting process.
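
To illustrate the census-synchronisation idea in the simplest possible terms, the sketch below reconciles two national voter rolls through a shared European identifier and flags identities registered more than once, so that "one person, one vote" can be enforced before credentials are issued. The identifier scheme and data layout are invented for illustration; they are not the thesis's protocol or the European Citizen Card specification.

```python
# Hedged sketch: merging national voter rolls keyed by a shared European identity.
from collections import defaultdict

# Each member state exposes its census as {european_id: national_record}.
CENSUS_ES = {"EU-0001": {"state": "ES", "eligible": True},
             "EU-0002": {"state": "ES", "eligible": True}}
CENSUS_PT = {"EU-0002": {"state": "PT", "eligible": True},   # also registered in ES
             "EU-0003": {"state": "PT", "eligible": True}}

def consolidate(*censuses):
    """Merge national rolls and flag identities registered more than once."""
    seen = defaultdict(list)
    for census in censuses:
        for eid, record in census.items():
            seen[eid].append(record["state"])
    roll, duplicates = {}, {}
    for eid, states in seen.items():
        if len(states) > 1:
            duplicates[eid] = states        # must be resolved before voting
        else:
            roll[eid] = states[0]
    return roll, duplicates

roll, duplicates = consolidate(CENSUS_ES, CENSUS_PT)
print("clean roll:", roll)
print("needs resolution:", duplicates)
```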

Relevância:

80.00%

Publicador:

Resumo:

PAMELA (Phased Array Monitoring for Enhanced Life Assessment) SHM™ System is an integrated embedded ultrasonic guided-wave based system consisting of several electronic devices and one system manager controller. The data collected by all PAMELA devices in the system must be transmitted to the controller, which is responsible for carrying out the advanced signal processing to obtain SHM maps. PAMELA devices consist of hardware based on a Virtex 5 FPGA with a PowerPC 440 running an embedded Linux distribution. Therefore, PAMELA devices, in addition to being able to perform tests and transmit the collected data to the controller, are capable of local data processing or pre-processing (reduction, normalization, pattern recognition, feature extraction, etc.). Local data processing decreases the data traffic over the network and reduces the CPU load of the external computer. PAMELA devices can even run autonomously, performing scheduled tests and communicating with the controller only when structural damage is detected or when programmed to do so. Each PAMELA device integrates a software management application (SMA) that allows the developer to download his or her own algorithm code and add the new data processing algorithm to the device. The development of the SMA is done in a virtual machine with an Ubuntu Linux distribution that includes all the software tools needed to carry out the entire development cycle. The Eclipse IDE (Integrated Development Environment) is used to develop the SMA project and to write the code of each data processing algorithm. This paper presents the developed software architecture and describes the steps necessary to add new data processing algorithms to the SMA in order to increase the processing capabilities of PAMELA devices. An example of basic damage index estimation using a delay-and-sum algorithm is provided.
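
A hedged sketch of a basic delay-and-sum damage index, in the spirit of the example mentioned above, is given below: baseline-subtracted traces are sampled at the transmitter-point-receiver time of flight and summed for each candidate image point. The array geometry, sampling rate and signals are invented for illustration; this is not the PAMELA SMA algorithm itself.

```python
# Hedged sketch of delay-and-sum damage index estimation.
import numpy as np

FS = 1.0e6          # sampling rate, Hz (assumed)
C = 5000.0          # guided-wave group velocity, m/s (assumed)

def delay_and_sum(signals, tx_pos, rx_pos, point):
    """
    Sum the residual signals sampled at the time of flight
    transmitter -> image point -> receiver for each channel.
    signals: array (n_channels, n_samples) of baseline-subtracted traces.
    """
    value = 0.0
    for ch, trace in enumerate(signals):
        tof = (np.linalg.norm(point - tx_pos) +
               np.linalg.norm(point - rx_pos[ch])) / C
        idx = int(round(tof * FS))
        if idx < trace.shape[0]:
            value += trace[idx]
    return abs(value)

# Toy data: 4 receivers, 2000 samples of noise standing in for residual signals.
rng = np.random.default_rng(0)
signals = rng.normal(0.0, 1e-3, size=(4, 2000))
tx = np.array([0.0, 0.0])
rx = np.array([[0.1, 0.0], [0.2, 0.0], [0.3, 0.0], [0.4, 0.0]])

# Damage index over a small grid of candidate points; report the maximum.
grid = [np.array([x, y]) for x in (0.05, 0.15, 0.25) for y in (0.05, 0.10)]
index = {tuple(p): delay_and_sum(signals, tx, rx, p) for p in grid}
print(max(index, key=index.get))
```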

Relevância:

80.00%

Publicador:

Resumo:

This work proposes a study of the acoustic characteristics of artisanal and industrial brick manufactured in Ecuador, taking into account their particular characteristics with respect to the quality of the raw material, as well as the fact that artisanal production is not regulated: although a number of regulations exist, they are not always followed by artisanal producers, which gives the material particular properties. The main idea of this work is to generate initial reference data on the acoustic properties of artisanal and industrial brick, since no study of this nature exists on the subject in Ecuador, and to create the mechanisms needed for a possible extension of the study to other materials typical of Ecuador, which would allow a database of their acoustic properties to be built. Another important aspect of this research is becoming familiar with the use of measurement techniques, the handling of equipment and various software, and the handling and comparison of standards.