999 results for pre-lecture assignment


Relevance: 20.00%

Publisher:

Abstract:

Consider the problem of scheduling a set of implicit-deadline sporadic tasks to meet all deadlines on a heterogeneous multiprocessor platform. We consider a restricted case in which the maximum utilization of any task on any processor in the system is no greater than one. We use a state-of-the-art algorithm proposed in [1] (we refer to it as LP-EE) for assigning tasks to a heterogeneous multiprocessor platform and (re-)prove its performance guarantee for this restricted case, but against a stronger adversary. We show that if a task set can be scheduled to meet deadlines on a heterogeneous multiprocessor platform by an optimal task assignment scheme that allows task migrations, then LP-EE meets deadlines as well, with no migrations, if given processors twice as fast.


Consider the problem of scheduling real-time tasks on a multiprocessor with the goal of meeting deadlines. Tasks arrive sporadically and have implicit deadlines, that is, the deadline of a task is equal to its minimum inter-arrival time. We address this problem with global static-priority scheduling, and present a priority-assignment scheme with the property that if at most 38% of the processing capacity is requested, then all deadlines are met.
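The sufficient utilization test stated above can be sketched as follows. This is a minimal illustration of the schedulability condition only, not the paper's priority-assignment scheme itself; the function name and task values are assumed for the example:

```python
# Minimal sketch (names assumed, not from the paper): a utilization-based
# schedulability test. If the task set requests at most 38% of the
# platform's processing capacity, the priority-assignment scheme above
# guarantees that all deadlines are met.

def requests_at_most(tasks, m, bound=0.38):
    """tasks: list of (wcet, min_interarrival) pairs; m: processor count."""
    total_utilization = sum(c / t for c, t in tasks)
    return total_utilization <= bound * m

# Example: three tasks on 2 processors; total utilization 0.6 <= 0.38 * 2.
tasks = [(1, 5), (2, 10), (1, 5)]
print(requests_at_most(tasks, m=2))  # prints True
```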


The objective of this study was to investigate an association between pre-harvest sugarcane burning and respiratory diseases in children under five years of age. The following data were collected in five schools in the city of Araraquara, SP, Southeastern Brazil, between March and June 2009: daily records of absences and the reasons stated for them, total concentration of suspended particulate matter (µg/m³), and air humidity. The relationship between the percentage of school absences due to respiratory problems and the concentration of particulate matter showed a distinct pattern in March and from April to June: absences increased as the particulate matter concentration rose. The use of school absences as an indicator of this relationship is an innovative approach.


Additional apple juice extraction from pulsed-electric-field (PEF) pretreated apple cubes is evaluated against control samples. Monopolar and bipolar pulses are compared, and their effect is studied by varying the electric field strength, pulse width, and number of pulses. The electric field strength ranged from 100 V/cm to 1300 V/cm, the pulse width from 20 µs to 300 µs, and the number of pulses from 10 to 200, at a frequency of 200 Hz. Two pulse trains separated by 1 s were applied to all samples. Bipolar pulses gave higher apple juice yields across all studied parameters. The specific energy consumed was calculated, and a threshold beyond which higher energy inputs do not increase juice yield was found for a number of the parameter settings. The qualitative parameters total soluble matter (°Brix) and absorbance at 390 nm were determined for each sample; the results show no substantial differences between PEF-pretreated and control samples.


The work presented here focuses on determining the construction costs of small- and medium-diameter high-density polyethylene (HDPE) pipelines for basic sanitation, based on the methodology described in the book Custos de Construção e Exploração, Volume 9 of the series Gestão de Sistemas de Saneamento Básico, by Lencastre et al. (1994). That methodology was applied within construction-management procedures, and to that end unit costs were estimated for several sets of work items. According to Lencastre et al. (1994), "these sets cover earthworks, piping, fittings and their operating devices, paving, and the construction site, the site portion including ancillary works related to the job." The costs were obtained by analysing several budgets for sanitation works resulting from recently held public tenders. To turn this methodology into an effective tool, spreadsheets were organised that make it possible to obtain realistic estimates of the execution costs of a given work at stages prior to design development, namely when preparing a system master plan or when carrying out economic and financial feasibility studies, that is, even before any preliminary sizing of the system elements exists. Another technique implemented to assess the input data was "Robust Data Analysis" (Pestana, 1992). This methodology allowed the data to be examined in greater detail before hypotheses were formulated for the risk analysis. The main idea is a highly flexible examination of the data, often even before comparing them to a probabilistic model. For a large data set, this technique made it possible to analyse the spread of the values found for the various work items mentioned above.
With the data collected and processed, a risk-analysis methodology was then applied through Monte Carlo simulation. This risk analysis was carried out with a Palisade software tool, @Risk, available at the Department of Civil Engineering. This quantitative risk-analysis technique makes it possible to express the uncertainty of the input data through the probability distributions provided by the software. To put the methodology into practice, the spreadsheets built following the approach proposed in Lencastre et al. (1994) were used. Producing and analysing these estimates can support decisions on the viability of the works to be carried out, particularly with regard to economic aspects, allowing a well-founded decision on whether to make the investments.
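The Monte Carlo step described above can be sketched in a few lines. This is a minimal illustration using Python's standard library instead of the @Risk/spreadsheet toolchain; all cost items, distributions, and values are illustrative assumptions, not figures from the study:

```python
# Minimal Monte Carlo cost-risk sketch. Every unit cost and distribution
# below is an illustrative assumption, not data from the study.
import random

random.seed(42)

# Illustrative cost items (EUR per metre of HDPE pipeline), each modelled
# as a triangular distribution (min, mode, max) to express uncertainty.
cost_items = {
    "earthworks": (8.0, 12.0, 20.0),
    "piping":     (15.0, 18.0, 25.0),
    "paving":     (5.0, 9.0, 14.0),
}

def simulate_total_cost(length_m, n_trials=10_000):
    totals = []
    for _ in range(n_trials):
        # random.triangular takes (low, high, mode).
        unit_cost = sum(random.triangular(lo, hi, mode)
                        for lo, mode, hi in cost_items.values())
        totals.append(unit_cost * length_m)
    return totals

totals = sorted(simulate_total_cost(length_m=1000))
mean = sum(totals) / len(totals)
p90 = totals[int(0.9 * len(totals))]  # 90th-percentile cost estimate
print(f"mean: {mean:.0f} EUR, P90: {p90:.0f} EUR")
```

A percentile such as P90 is the kind of output that supports the feasibility decisions mentioned above: it bounds the cost that 90% of simulated scenarios stay below.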


Pultruded products are seeing growing demand due to their excellent mechanical properties and low chemical reactivity, which ensure a low level of maintenance and allow an easier assembly process than equivalent steel bars. In order to improve the mechanical drawing process and solve some acoustic and thermal insulation problems, pultruded pipes of glass fibre reinforced plastic (GFRP) can be filled with special products that increase their performance with regard to the issues referred to above. The great challenge of this work was designing new equipment able to produce pultruded pipes filled with cork or polymeric pre-shaped bars in a continuous process. The project was carried out successfully: the new equipment was built and integrated into the existing pultrusion line, making it possible to obtain new products with higher added value in the market and covering needs previously identified in the field of civil construction.


The ability to solve conflicting beliefs is crucial for multi-agent systems where the information is dynamic, incomplete and distributed over a group of autonomous agents. The proposed distributed belief revision approach consists of a distributed truth maintenance system and a set of autonomous belief revision methodologies. The agents have partial views and frequently hold disparate beliefs, which are automatically detected by the system's reason maintenance mechanism. The nature of these conflicts is dynamic and requires adequate methodologies for conflict resolution. The two types of conflicting beliefs addressed in this paper are Context Dependent and Context Independent Conflicts, which result, in the first case, from the assignment, by different agents, of opposite belief statuses to the same belief and, in the latter case, from holding contradictory distinct beliefs. The belief revision methodology for solving Context Independent Conflicts is basically a selection process based on assessing the credibility of the opposing belief statuses. The belief revision methodology for solving Context Dependent Conflicts is essentially a search process for a consensual alternative based on a "next best" relaxation strategy.
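The two conflict types can be illustrated with a toy encoding (entirely my own, not the paper's distributed truth maintenance system): each agent's belief base maps a proposition to a belief status, and conflicts are detected by comparing two bases:

```python
# Toy illustration (my own encoding, for this summary only): each belief
# base maps a proposition such as "p" or "~p" to a status in
# {"believed", "disbelieved"}.

def detect_conflicts(base_a, base_b):
    conflicts = []
    for prop, status in base_a.items():
        # Context Dependent Conflict: the same belief is assigned opposite
        # statuses by the two agents.
        if prop in base_b and base_b[prop] != status:
            conflicts.append(("context-dependent", prop))
        # Context Independent Conflict: the agents hold contradictory
        # distinct beliefs (a proposition and its negation, both believed).
        negation = prop[1:] if prop.startswith("~") else "~" + prop
        if status == "believed" and base_b.get(negation) == "believed":
            conflicts.append(("context-independent", prop))
    return conflicts

print(detect_conflicts({"p": "believed", "q": "believed"},
                       {"p": "disbelieved", "~q": "believed"}))
```

The actual resolution step described above (credibility assessment for the first type, "next best" relaxation for the second) would then be applied per detected conflict.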


This paper proposes and reports the development of an open source solution for the integrated management of Infrastructure as a Service (IaaS) cloud computing resources, through the use of a common API taxonomy, to incorporate open source and proprietary platforms. This research included two surveys on open source IaaS platforms (OpenNebula, OpenStack and CloudStack) and a proprietary platform (Parallels Automation for Cloud Infrastructure - PACI) as well as on IaaS abstraction solutions (jClouds, Libcloud and Deltacloud), followed by a thorough comparison to determine the best approach. The adopted implementation reuses the Apache Deltacloud open source abstraction framework, which relies on the development of software driver modules to interface with different IaaS platforms, and involved the development of a new Deltacloud driver for PACI. The resulting interoperable solution successfully incorporates OpenNebula, OpenStack (reuses pre-existing drivers) and PACI (includes the developed Deltacloud PACI driver) nodes and provides a Web dashboard and a Representational State Transfer (REST) interface library. The results of the exchanged data payload and time response tests performed are presented and discussed. The conclusions show that open source abstraction tools like Deltacloud allow the modular and integrated management of IaaS platforms (open source and proprietary), introduce relevant time and negligible data overheads and, as a result, can be adopted by Small and Medium-sized Enterprise (SME) cloud providers to circumvent the vendor lock-in problem whenever service response time is not critical.


The effect of pre-meal tomato intake on the anthropometric indices and blood levels of triglycerides, cholesterol, glucose, and uric acid of a population of young women (n=35, 19.6 ± 1.3 years) was evaluated. Every day for 4 weeks, participants ingested a raw ripe tomato (~90 g) before lunch. Their anthropometric and biochemical parameters were measured repeatedly during the follow-up period. At the end of the 4 weeks, significant reductions were observed in body weight (-1.09 ± 0.12 kg on average), fat percentage (-1.54 ± 0.52%), fasting blood glucose (-5.29 ± 0.80 mg/dl), triglycerides (-8.31 ± 1.34 mg/dl), cholesterol (-10.17 ± 1.21 mg/dl), and uric acid (-0.16 ± 0.04 mg/dl). Pre-meal tomato ingestion appeared to have a positive effect on the body weight, fat percentage, and blood levels of glucose, triglycerides, cholesterol, and uric acid of the young adult women who participated in this study.


Consider the problem of assigning implicit-deadline sporadic tasks on a heterogeneous multiprocessor platform comprising two different types of processors—such a platform is referred to as two-type platform. We present two low degree polynomial time-complexity algorithms, SA and SA-P, each providing the following guarantee. For a given two-type platform and a task set, if there exists a task assignment such that tasks can be scheduled to meet deadlines by allowing them to migrate only between processors of the same type (intra-migrative), then (i) using SA, it is guaranteed to find such an assignment where the same restriction on task migration applies but given a platform in which processors are 1+α/2 times faster and (ii) SA-P succeeds in finding a task assignment where tasks are not allowed to migrate between processors (non-migrative) but given a platform in which processors are 1+α times faster. The parameter 0<α≤1 is a property of the task set; it is the maximum of all the task utilizations that are no greater than 1. We evaluate average-case performance of both the algorithms by generating task sets randomly and measuring how much faster processors the algorithms need (which is upper bounded by 1+α/2 for SA and 1+α for SA-P) in order to output a feasible task assignment (intra-migrative for SA and non-migrative for SA-P). In our evaluations, for the vast majority of task sets, these algorithms require significantly smaller processor speedup than indicated by their theoretical bounds. Finally, we consider a special case where no task utilization in the given task set can exceed one and for this case, we (re-)prove the performance guarantees of SA and SA-P. We show, for both of the algorithms, that changing the adversary from intra-migrative to a more powerful one, namely fully-migrative, in which tasks can migrate between processors of any type, does not deteriorate the performance guarantees. 
For this special case, we compare the average-case performance of SA-P and a state-of-the-art algorithm by generating task sets randomly. In our evaluations, SA-P outperforms the state-of-the-art by requiring much smaller processor speedup and by running orders of magnitude faster.
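The parameter α and the two speedup bounds can be computed directly from a task set's utilizations; a minimal sketch with illustrative utilization values (not from the paper):

```python
# Sketch of the speedup parameter described above: alpha is the maximum of
# all task utilizations that are no greater than one, and the theoretical
# speedup bounds are 1 + alpha/2 for SA and 1 + alpha for SA-P.
# (The utilization values below are illustrative, not from the paper.)

def speedup_bounds(utilizations):
    alpha = max(u for u in utilizations if u <= 1.0)
    return 1 + alpha / 2, 1 + alpha  # (SA bound, SA-P bound)

sa_bound, sap_bound = speedup_bounds([0.3, 0.8, 0.5, 1.4])
print(sa_bound, sap_bound)  # alpha = 0.8, so the bounds are 1.4 and 1.8
```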


Consider the problem of assigning implicit-deadline sporadic tasks on a heterogeneous multiprocessor platform comprising a constant number (denoted by t) of distinct types of processors—such a platform is referred to as a t-type platform. We present two algorithms, LPGIM and LPGNM, each providing the following guarantee. For a given t-type platform and a task set, if there exists a task assignment such that tasks can be scheduled to meet their deadlines by allowing them to migrate only between processors of the same type (intra-migrative), then: (i) LPGIM succeeds in finding such an assignment where the same restriction on task migration applies (intra-migrative) but given a platform in which only one processor of each type is 1 + α × (t-1)/t times faster and (ii) LPGNM succeeds in finding a task assignment where tasks are not allowed to migrate between processors (non-migrative) but given a platform in which every processor is 1 + α times faster. The parameter α is a property of the task set; it is the maximum of all the task utilizations that are no greater than one. To the best of our knowledge, for t-type heterogeneous multiprocessors: (i) for the problem of intra-migrative task assignment, no previous algorithm exists with a proven bound and hence our algorithm, LPGIM, is the first of its kind and (ii) for the problem of non-migrative task assignment, our algorithm, LPGNM, has superior performance compared to the state of the art.


The multiprocessor scheduling scheme NPS-F for sporadic tasks has a high utilisation bound and an overall number of preemptions bounded at design time. NPS-F bin-packs tasks offline to as many servers as needed. At runtime, the scheduler ensures that each server is mapped to at most one of the m processors at any instant. When scheduled, servers use EDF to select which of their tasks to run. Yet, unlike the overall number of preemptions, the migrations per se are not tightly bounded. Moreover, we cannot know a priori which task a server will be executing at the instant when it migrates. This uncertainty complicates the estimation of cache-related preemption and migration delays (CPMD), potentially resulting in their overestimation. Therefore, to simplify the CPMD estimation, we propose an amended bin-packing scheme for NPS-F that allows us (i) to identify, at design time, which task migrates at which instant and (ii) to bound a priori the number of migrating tasks, while preserving the utilisation bound of NPS-F.
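The offline bin-packing step can be illustrated with a generic first-fit-decreasing sketch; this is a standard heuristic shown for illustration, not the amended NPS-F packing scheme itself:

```python
# Generic first-fit-decreasing sketch of the offline bin-packing step:
# tasks are packed into as many servers as needed, each server holding at
# most one processor's worth of utilization. This is an illustration only,
# not the amended NPS-F scheme described above.

def pack_into_servers(utilizations, capacity=1.0):
    servers = []  # each server is a list of task utilizations
    for u in sorted(utilizations, reverse=True):
        for server in servers:
            if sum(server) + u <= capacity:
                server.append(u)
                break
        else:
            servers.append([u])  # no existing server fits: open a new one
    return servers

servers = pack_into_servers([0.6, 0.5, 0.4, 0.3, 0.2])
print(len(servers))  # these five tasks fit into 2 servers
```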


Consider scheduling of real-time tasks on a multiprocessor where migration is forbidden. Specifically, consider the problem of determining a task-to-processor assignment for a given collection of implicit-deadline sporadic tasks upon a multiprocessor platform with two distinct types of processors. For this problem, we propose a new algorithm, LPC (task assignment based on solving a Linear Program with Cutting planes). The algorithm offers the following guarantee: for a given task set and platform, if there exists a feasible task-to-processor assignment, then LPC also succeeds in finding such an assignment, but on a platform in which each processor is 1.5× faster and which has three additional processors. For systems with a large number of processors, LPC has a better approximation ratio than state-of-the-art algorithms. To the best of our knowledge, this is the first work that develops a provably good real-time task assignment algorithm using cutting planes.


The pre-exposure anti-rabies immunization schedule currently used in Brazil is the so-called 3+1 schedule, employing suckling-mouse-brain vaccine (3 doses on alternate days and the last one on day 30). Although satisfactory results were obtained in well-controlled experimental groups using this schedule, in our routine practice virus neutralizing antibody (VNA) levels lower than 0.5 IU/ml are frequently found. We studied the pre-exposure 3+1 schedule under field conditions in different cities in the State of São Paulo, Brazil, under variable and sometimes adverse circumstances, such as the use of different vaccine batches with different titers, delivered, stored and administered under local conditions. Fifty out of 256 serum samples (19.5%) showed VNA titers lower than 0.5 IU/ml, but they were not distributed homogeneously among the localities studied. While in some cities the results were completely satisfactory, in others almost 40% of samples did not attain the minimum required VNA titer. The results presented here, considered separately, call into question our currently used procedures for human pre-exposure anti-rabies immunization. The reasons for this situation are discussed.


This study reports preliminary results of virus neutralizing antibody (VNA) titers obtained on different days in the course of human anti-rabies immunization with the 2-1-1 schedule (one dose given in the right arm and one in the left arm on day 0, and one dose applied on days 7 and 21), recommended by WHO for post-exposure treatment with cell-culture vaccines. A variant schedule (double dose on day zero and another dose on day 14) was also tested, both employing suckling-mouse-brain vaccine. A complete seroconversion rate was obtained after only 3 vaccine doses, and almost all patients (11 of 12) presented titers higher than 1.0 IU/ml. Both the neutralizing response and the seroconversion rate were lower in the group receiving only 3 doses, regardless of the sample collection day. Although our results are lower than those found with cell-culture vaccines, the geometric mean of VNA titers is fully satisfactory, exceeding the lower limit of 0.5 IU/ml recommended by WHO. The 2-1-1 schedule could be an alternative for pre-exposure immunization, shorter than the classical 3+1 regimen (one dose on days 0, 2, 4 and 30), with only three visits to the doctor instead of four.