941 results for Multiprocessor scheduling with resource sharing
Abstract:
The fast-developing international trade in products based on traditional knowledge, and the value chains behind it, has become an important aspect of the ethnopharmacological debate. The structure and diversity of value chains, and their impact on the phytochemical composition of herbal medicinal products, have been overlooked in the debate about quality problems in transnational trade. Different government policies and regulations governing trade in herbal medicinal products affect such value chains. Medicinal Rhodiola species, including Rhodiola rosea L. and Rhodiola crenulata (Hook.f. & Thomson) H.Ohba, have been used widely in Europe and Asia as traditional herbal medicines, with numerous claims for their therapeutic effects. Faced with resource depletion and environmental destruction, R. rosea and R. crenulata are becoming endangered, making them more economically valuable to collectors and middlemen and also increasing the risk of adulteration and low quality. We compare the phytochemical differences among Rhodiola raw materials available on the market in order to provide a practical method for Rhodiola authentication and for the detection of potential adulterants. Samples were collected in Europe and Asia, and nuclear magnetic resonance (NMR) spectroscopy coupled with multivariate analysis, together with high-performance thin-layer chromatography (HPTLC), was used to analyse them. A method was developed to quantify the amount of an adulterant species contained within mixtures. We compared the phytochemical composition of the collected Rhodiola samples with authenticated samples. Rosavin and rosarin were mainly present in R. rosea, whereas crenulatin was only present in R. crenulata. Thirty percent of the Rhodiola samples purchased on the Chinese market were adulterated with other Rhodiola spp., and 7% of the raw-material samples were not labelled satisfactorily.
The combined use of 1H-NMR and HPTLC methods provided an integrated analysis of the phytochemical differences and a novel identification method for R. rosea and R. crenulata. Using 1H-NMR spectroscopy it was possible to quantify the presence of R. crenulata in admixtures with R. rosea. This quantitative technique could be used in the future to assess a variety of herbal drugs and products. The project also highlights the need to further study the links between producers and consumers in national and transnational trade.
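The quantification idea in this abstract can be illustrated with a toy two-component mixture model (a sketch only, not the authors' NMR workflow; the intensity values and marker regions below are invented). Given marker-signal intensities for pure R. rosea and pure R. crenulata, the fraction of the second component in a mixture has a closed-form least-squares estimate:

```python
# Illustrative sketch (not the authors' method): estimate the fraction of an
# adulterant in a two-component mixture from marker-signal intensities.
# If a mixture spectrum is modelled as s = (1-f)*a + f*b, where a and b are
# reference intensity vectors of the two pure species, the least-squares
# estimate of f has a closed form.

def mixture_fraction(s, a, b):
    """Least-squares fraction f of component b in mixture s = (1-f)*a + f*b."""
    num = sum((si - ai) * (bi - ai) for si, ai, bi in zip(s, a, b))
    den = sum((bi - ai) ** 2 for ai, bi in zip(a, b))
    return num / den

# Hypothetical marker intensities (e.g. a rosavin-region peak and a
# crenulatin-region peak); the numbers are made up for illustration.
pure_rosea     = [10.0, 0.5]
pure_crenulata = [0.5, 8.0]
mix            = [7.15, 2.75]   # constructed as 70% rosea + 30% crenulata

print(round(mixture_fraction(mix, pure_rosea, pure_crenulata), 2))  # → 0.3
```

In practice the reference vectors would span many spectral bins and the fit would be regularized, but the principle of regressing a mixture spectrum onto pure-species references is the same.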
Abstract:
E-books on their own are complex; they become even more so in the context of course reserves. In FY2016 the Resource Sharing & Reserves and Acquisitions units developed a new workflow for vetting requested e-books to ensure that they were suitable for course reserves (i.e., that they permit unlimited simultaneous users) before posting links to them within the university's online learning management system. In the Spring 2016 semester, 46 e-books were vetted through this process, resulting in 18 purchases. Preliminary data analysis sheds light on the suitability of the Libraries' current e-book collections for course reserves, as well as on faculty preferences, with potential implications for the Libraries' ordering process. We hope this lightning talk will generate discussion about these issues among selectors, collection managers, and reserves staff alike.
Abstract:
EPE Report - Practicum report in Pre-School Education: This practicum report for professional qualification was produced within the course unit Supervised Pedagogical Practice in Pre-School Education, an integral part of the Master's in Pre-School Education and Teaching of the 1st Cycle of Basic Education. The work presented here aims to describe, in a critical, reflective and substantiated way, the experiences the trainee had with the group of children she accompanied over a total of 210 hours of practicum, which began in February and ended in June of this year. The pedagogical practice of an early childhood educator directly influences the development of the children as well as that of the educator, contributing to the development of competences fundamental to future practice. To plan around the needs and interests shown by the group of children, the educator must observe, plan, act, evaluate, communicate and articulate, while also carrying out action research as a way of reflecting on their own practice and on its effects, both on themselves and on the children. On this basis, developmental needs, interests and learning outcomes were observed, enabling the planning of activities designed to achieve the goals set for developing the children's capacities. To this end, innovative and diversified strategies were used, supported by the High/Scope and Reggio Emilia curricular models and by the Project Work Methodology. The work developed around the course unit targets competences such as mobilizing knowledge, adopting differentiated strategies, making conscious and appropriate decisions, developing research projects, and developing and consolidating socio-professional and personal competences.
In this sense, it contributed to the personal development and professional growth of the master's student through the interventions carried out with the group of children. The different stages of the educational process were followed closely, in that they contributed to the trainee's greater knowledge of action and planning. It should be noted that the activities developed throughout this period targeted the development of all content areas and domains in the children, with particular emphasis on personal and social education and on knowledge of the world.
Abstract:
Datacenters have emerged as the dominant form of computing infrastructure over the last two decades. The tremendous increase in the requirements of data analysis has led to a proportional increase in power consumption and datacenters are now one of the fastest growing electricity consumers in the United States. Another rising concern is the loss of throughput due to network congestion. Scheduling models that do not explicitly account for data placement may lead to a transfer of large amounts of data over the network causing unacceptable delays. In this dissertation, we study different scheduling models that are inspired by the dual objectives of minimizing energy costs and network congestion in a datacenter. As datacenters are equipped to handle peak workloads, the average server utilization in most datacenters is very low. As a result, one can achieve huge energy savings by selectively shutting down machines when demand is low. In this dissertation, we introduce the network-aware machine activation problem to find a schedule that simultaneously minimizes the number of machines necessary and the congestion incurred in the network. Our model significantly generalizes well-studied combinatorial optimization problems such as hard-capacitated hypergraph covering and is thus strongly NP-hard. As a result, we focus on finding good approximation algorithms. Data-parallel computation frameworks such as MapReduce have popularized the design of applications that require a large amount of communication between different machines. Efficient scheduling of these communication demands is essential to guarantee efficient execution of the different applications. In the second part of the thesis, we study the approximability of the co-flow scheduling problem that has been recently introduced to capture these application-level demands. 
Finally, we also study the question, "In what order should one process jobs?" Often, precedence constraints specify a partial order over the set of jobs, and the objective is to find suitable schedules that satisfy the partial order. However, in the presence of hard deadline constraints, it may be impossible to find a schedule that satisfies all precedence constraints. In this thesis we formalize different variants of job scheduling with soft precedence constraints and conduct the first systematic study of these problems.
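The tension between hard deadlines and precedence constraints can be sketched with a toy example (illustrative only; the thesis's actual formulations and algorithms are not reproduced here). Ordering jobs by Earliest Deadline First and counting how many precedence pairs are violated gives the flavour of a "soft precedence" objective:

```python
# Toy illustration: an EDF ordering meets deadlines greedily but may break
# precedence constraints; counting the broken pairs is one natural
# soft-precedence objective.

def edf_order(jobs):
    """Return job ids sorted by deadline (Earliest Deadline First)."""
    return sorted(jobs, key=lambda j: jobs[j])

def precedence_violations(order, prec):
    """Count pairs (u, v) where u should precede v but appears after it."""
    pos = {j: i for i, j in enumerate(order)}
    return sum(1 for u, v in prec if pos[u] > pos[v])

jobs = {"a": 4, "b": 2, "c": 3}        # job -> deadline
prec = [("a", "b"), ("a", "c")]        # a should run before b and c
order = edf_order(jobs)                # ['b', 'c', 'a']
print(order, precedence_violations(order, prec))  # both pairs are violated
```

Here EDF pushes job a last because its deadline is latest, violating both precedence pairs; a soft-precedence formulation would trade off such violations against deadline feasibility.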
Abstract:
A High-Performance Computing (HPC) job dispatcher is a critical piece of software that assigns the finite computing resources to submitted jobs. This resource assignment over time is known as the on-line job dispatching problem in HPC systems. Because the problem is on-line, solutions must be computed in real time, and the time required cannot exceed a certain threshold without disrupting normal system operation. In addition, a job dispatcher must deal with considerable uncertainty: submission times, the number of requested resources, and job durations. Heuristic-based techniques have been widely used in HPC systems because they achieve (sub-)optimal solutions in a short time. However, their scheduling and resource-allocation components are separate, producing decoupled decisions that may cause a performance loss. Optimization-based techniques are used less often for this problem, although they can significantly improve the performance of HPC systems at the expense of higher computation time. Nowadays, HPC systems are used for modern applications, such as big-data analytics and predictive model building, which in general comprise many short jobs. This information is unknown at dispatching time, however, and job dispatchers need to process large numbers of such jobs quickly while ensuring high Quality-of-Service (QoS) levels. Constraint Programming (CP) has been shown to be an effective approach for tackling job dispatching problems. However, state-of-the-art CP-based job dispatchers cannot meet the challenges of on-line dispatching, such as generating dispatching decisions quickly and integrating current and past information about the hosting system.
For these reasons, we propose CP-based dispatchers that are better suited to HPC systems running modern applications: they generate on-line dispatching decisions within an appropriate time and can make effective use of job-duration predictions to improve QoS levels, especially for workloads dominated by short jobs.
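For contrast with the CP-based approach described above, a minimal heuristic dispatcher can be sketched in a few lines (an invented toy, not any production HPC dispatcher or the abstract's method): each job requests some cores and is placed on the first node with enough free cores, otherwise it stays queued:

```python
# Minimal first-fit dispatching heuristic (illustrative only; real HPC
# dispatchers and the CP-based ones in the abstract are far more involved).
# Nodes are described by their free core counts; jobs by (id, cores needed).

def first_fit_dispatch(free_cores, queue):
    """Assign each queued job to the first node that fits, else defer it."""
    placement, deferred = {}, []
    cores = list(free_cores)           # work on a copy of node capacities
    for job, need in queue:
        for node, avail in enumerate(cores):
            if avail >= need:
                cores[node] -= need    # reserve the cores on this node
                placement[job] = node
                break
        else:
            deferred.append(job)       # not enough free cores anywhere
    return placement, deferred

placement, deferred = first_fit_dispatch([4, 8], [("j1", 3), ("j2", 6), ("j3", 4)])
print(placement, deferred)  # j3 must wait for resources to free up
```

Note how scheduling (when a job runs) and allocation (where it runs) are decided in one greedy pass with no look-ahead; that decoupled, myopic behaviour is exactly what optimization-based dispatchers try to improve on.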
Abstract:
Nursing is at the same time a vocation, a profession and a job. By nature, nursing is a moral endeavor, and being a 'good nurse' is an issue and an aspiration for professionals. The aim of our qualitative research project, carried out with 18 nurse teachers at a university nursing school in Brazil, was to identify the ethical image of nursing. In semi-structured interviews the participants were asked to choose one of several pictures, to justify their choice and to explain what they meant by an ethical nurse. Five different perspectives were revealed: good nurses fulfill their duties correctly; they are proactive patient advocates; they are prepared and available to welcome others as persons; they are talented, competent, and carry out professional duties excellently; and they combine authority with power sharing in patient care. The results point to a transition from a historical introjection of the religious values of obedience and service to a new sense of a secular, proactive, scientific and professional identity.
Abstract:
In the 1970s, a new line of research focused on the influence of the audit report on the decision-making process of investors, financial analysts and credit analysts. Notwithstanding the numerous studies that have been carried out, results have not been consistent. Given the above, and considering the lack of research of this nature in Portugal, it seemed urgent to carry out a study analysing the use of the audit report and its influence on the decision-making process of Portuguese stakeholders. For that purpose, in the light of the positivist research paradigm, a questionnaire was designed and administered by mail and on the Survey Monkey platform to a sample of institutional investors, financial analysts and credit analysts. The statistical analysis of the data was undertaken with the Statistical Package for the Social Sciences and SmartPLS 2.0. Corroborating the literature review and the assumptions of Agency Theory and Stakeholder Theory, used in the theoretical framework of analysis, the empirical evidence showed that the audit report influences the decisions of institutional investors, financial analysts and credit analysts, and that the opinion expressed in that document is the most determinant factor in this influence. In addition, it was found that the degree of utilization of the audit report, as well as the value ascribed to the document, determine its influence on the decision-making process of the groups studied. Only in the case of institutional investors did the results reveal no correlation between the utility ascribed to the audit report and its influence on their decision-making process.
In turn, statistical inference on the model explaining the degree of use of the audit report revealed that it is conditioned by the perceived quality of the information contained in the audit report, the utility assigned to the audit report in the decision-making process, and the relevance of the other sources of information used by stakeholders. This study therefore demonstrated the importance of the audit report to its users. As a result, we believe we have filled a gap in the national literature and contributed to the international literature. The importance of this document for the development of any country is thus shown, and it is essential to maintain rigour in the selection of audit staff, in the development of auditing standards, and especially in the conduct of audits. Moreover, we consider that this research may contribute to the improvement of the audit report, insofar as it will help professional bodies to understand the information needs and perceptions of stakeholders.
Abstract:
Master's dissertation in Environment, Health and Safety.
Abstract:
One recognized problem in education is that learning can be based too heavily on theory. Since human experience shapes much of how we see and live in the world, practice is indispensable in forming our knowledge. Although theory is always necessary for building concepts, it should be complemented with experience so as to consolidate learning and give a better sense of reality. This dissertation describes a didactic approach for integrating haptic devices into education, conceiving a new and innovative teaching method allied to practice. Depending on its acceptance by students, this use of technology in education for practical purposes may prove revolutionary. Experiments that would be difficult to perform become possible to simulate realistically with the help of haptic systems, and the range of subjects such applications can simulate is vast. Specifically, this work is based on the study of aerodynamics in flight, using an application developed for the purpose and the capabilities of the Novint Falcon haptic device, a tactile sensory interface between a person and a computer that is relatively inexpensive compared with most devices of this kind. Tests of the application by students revealed great interest in and curiosity about the novelty of haptic technology, and appreciation of the concept of its practical use in education. In general, all the students who took part in the trial of the program gave positive feedback, expressing increased motivation and a desire to see this system applied to other subjects.
Abstract:
The contribution of the research field of Human-Computer Interaction (HCI) is evident not only in the quality of interaction but also in the diversification of forms of interaction. HCI is defined as a discipline devoted to the design, development and implementation of interactive computing systems for human use, and to the study of the relevant phenomena surrounding them. The aim of this master's thesis is the development of a Graphical Factory-Layout Editor to be integrated into a Decision Support System for Production Planning and Control. The system must be able to generate a factory layout containing, among other objects, the graphical representations and the respective characteristics/attributes of the set of resources (machines/processors) in the production system being modelled. The developed module will be integrated into the ADSyS R&D project (Adaptative Decision Support System for Interactive Scheduling with MetaCognition and User Modeling Experience), improving interaction aspects of the AutoDynAgents system, a system dedicated to production scheduling, planning and control. A usability analysis of the module was carried out to evaluate it, through an efficiency test and a questionnaire, from which a set of improvements and suggestions was identified for refining the module.
Abstract:
Secure group communication is a paradigm that primarily designates one-to-many communication security. Previous work on secure group communication has predominantly considered the whole network as a single group managed by a central, powerful node capable of supporting heavy communication, computation and storage costs. However, a typical Wireless Sensor Network (WSN) may contain several groups, each maintained by a sensor node (the group controller) with constrained resources. Moreover, previously proposed schemes require multicast routing support to deliver the rekeying messages, yet multicast routing can incur heavy storage and communication overheads in a wireless sensor network. Because of these two major limitations, we propose a new secure group communication scheme with a lightweight rekeying process. Our proposal overcomes the two limitations mentioned above and can be applied to a homogeneous WSN with resource-constrained nodes, with no need for multicast routing support. Analysis and simulation results demonstrate that our scheme outperforms previous well-known solutions.
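For intuition about rekeying cost, a naive baseline scheme (not the proposed one) can be sketched: when a member leaves, the group controller derives a fresh group key and sends it to every remaining member encrypted under that member's pairwise key, so the cost grows linearly with group size, which is the overhead a lightweight scheme aims to reduce:

```python
# Naive per-member rekeying sketch, for contrast with lightweight schemes.
# The "encryption" below is a stand-in (XOR with a hash-derived pad), used
# only to make the message-count bookkeeping runnable; it is NOT secure.
import hashlib
import os

def rekey_on_leave(pairwise_keys, leaving):
    """Return a fresh group key and one rekey message per remaining member."""
    new_group_key = os.urandom(16)
    messages = {}
    for member, pk in pairwise_keys.items():
        if member == leaving:
            continue                   # the leaver must not learn the new key
        pad = hashlib.sha256(pk).digest()[:16]
        messages[member] = bytes(a ^ b for a, b in zip(new_group_key, pad))
    return new_group_key, messages

keys = {m: os.urandom(16) for m in ("n1", "n2", "n3")}
gk, msgs = rekey_on_leave(keys, "n2")
print(len(msgs))  # one unicast rekey message per remaining member → 2
```

With n members this costs n-1 unicast messages per leave event; schemes like the one in the abstract aim to shrink that communication and storage burden on a resource-constrained group controller.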
Abstract:
Consider the problem of scheduling a set of sporadic tasks on a multiprocessor system to meet deadlines using a task-splitting scheduling algorithm. Task-splitting (also called semi-partitioning) scheduling algorithms assign most tasks to just one processor, but a few tasks are assigned to two or more processors, and they are dispatched in a way that ensures a task never executes on two or more processors simultaneously. One type of task-splitting algorithm, called slot-based task-splitting, is of particular interest because of its ability to schedule tasks at high processor utilizations. We present a new schedulability analysis for slot-based task-splitting scheduling algorithms that takes overheads into account, together with a new task assignment algorithm.
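The flavour of semi-partitioned assignment can be sketched as follows (a toy first-fit-with-splitting scheme, not the paper's slot-based algorithm or its schedulability analysis): tasks are assigned to one processor while they fit, and a task that does not fit is split, with the remainder of its utilization placed on the next processor:

```python
# Toy semi-partitioned assignment: fill processors in order, splitting a
# task's utilization across a processor boundary when it does not fit.
# The dispatcher must then ensure a split task never runs on two
# processors at the same time (the hard part, not shown here).

def split_assign(utils, num_procs, capacity=1.0):
    """Assign task utilizations to processors, splitting across boundaries."""
    assignment = [[] for _ in range(num_procs)]   # per-proc (task, share) list
    proc, free = 0, capacity
    for task, u in enumerate(utils):
        while u > 1e-12:
            if proc >= num_procs:
                raise ValueError("task set does not fit")
            share = min(u, free)
            assignment[proc].append((task, share))
            u -= share
            free -= share
            if free <= 1e-12:                     # processor is full
                proc, free = proc + 1, capacity
    return assignment

print(split_assign([0.6, 0.6, 0.8], 2))
# task 1 is split: ~0.4 on processor 0 and ~0.2 on processor 1
```

This greedy scheme reaches 100% utilization on paper; the paper's contribution is an analysis that remains valid once dispatching overheads are accounted for.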
Abstract:
Dissertation presented to the Escola Superior de Comunicação Social in partial fulfilment of the requirements for the degree of Master in Strategic Management of Public Relations.
Abstract:
Master's in Electrical and Computer Engineering. Specialization Area: Autonomous Systems.
Abstract:
The multiprocessor scheduling scheme NPS-F for sporadic tasks has a high utilisation bound and an overall number of preemptions bounded at design time. NPS-F bin-packs tasks offline to as many servers as needed. At runtime, the scheduler ensures that each server is mapped to at most one of the m processors at any instant. When scheduled, servers use EDF to select which of their tasks to run. Yet, unlike the overall number of preemptions, the migrations themselves are not tightly bounded. Moreover, we cannot know a priori which task a server will be executing at the instant when it migrates. This uncertainty complicates the estimation of cache-related preemption and migration costs (CPMD), potentially resulting in their overestimation. Therefore, to simplify CPMD estimation, we propose an amended bin-packing scheme for NPS-F that allows us (i) to identify at design time which task migrates at which instant and (ii) to bound a priori the number of migrating tasks, while preserving the utilisation bound of NPS-F.
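The offline step can be illustrated with plain first-fit packing of task utilizations into unit-capacity servers (a simplification; NPS-F's actual packing rules and the amended scheme proposed in the abstract differ):

```python
# First-fit bin-packing sketch of an NPS-F-style offline step (illustrative;
# not NPS-F's exact rules): pack tasks by utilization into as many
# unit-capacity servers as needed. The servers would later be multiplexed
# onto the m physical processors.

def pack_into_servers(utils, capacity=1.0):
    """First-fit: place each task's utilization into the first server it fits."""
    servers = []                       # each server: [task list, remaining cap]
    for task, u in enumerate(utils):
        for srv in servers:
            if srv[1] >= u:            # task fits in this server
                srv[0].append(task)
                srv[1] -= u
                break
        else:                          # no existing server fits: open a new one
            servers.append([[task], capacity - u])
    return [srv[0] for srv in servers]

print(pack_into_servers([0.5, 0.7, 0.3, 0.4]))  # → [[0, 2], [1], [3]]
```

In the full scheme, the interesting questions arise after packing: which server straddles a processor boundary at runtime, and hence which of its tasks migrates, which is precisely the uncertainty the amended bin-packing aims to remove.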