959 results for Polynomial distributed lag models


Relevance:

20.00%

Publisher:

Abstract:

11th IEEE World Conference on Factory Communication Systems (WFCS 2015), 27–29 May 2015, TII-SS-2: Scheduling and Performance Analysis. Palma de Mallorca, Spain.

Relevance:

20.00%

Publisher:

Abstract:

XXXIII Simpósio Brasileiro de Redes de Computadores e Sistemas Distribuídos (SBRC 2015), 15–19 May 2015, III Workshop de Comunicação em Sistemas Embarcados Críticos. Vitória, Brazil.

Relevance:

20.00%

Publisher:

Abstract:

Distributed real-time systems, such as automotive applications, are becoming larger and more complex, thus requiring the use of more powerful hardware and software architectures. Furthermore, those distributed applications commonly have stringent real-time constraints. This implies that such applications would gain in flexibility if they were parallelized and distributed over the system. In this paper, we consider the problem of allocating fixed-priority fork-join Parallel/Distributed real-time tasks onto distributed multi-core nodes connected through a Flexible Time Triggered Switched Ethernet network. We analyze the system requirements and present a set of formulations based on a constraint programming approach. Constraint programming allows us to express the relations between variables in the form of constraints. Our approach is guaranteed to find a feasible solution, if one exists, in contrast to other approaches based on heuristics. Furthermore, approaches based on constraint programming have been shown to obtain solutions for this type of formulation in reasonable time.
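
As a rough illustration of the constraint programming style referred to above (not the paper's actual formulation), the following Python sketch uses Google OR-Tools CP-SAT to assign tasks to cores so that per-core utilization stays within a bound; the task utilizations, core count and capacity are invented values.

# Minimal constraint-programming sketch of a task-to-core allocation,
# using OR-Tools CP-SAT; placeholder data, not the paper's model.
from ortools.sat.python import cp_model

task_util = [30, 45, 20, 55, 25]   # hypothetical task utilizations (percent)
num_cores = 2
capacity = 100                      # per-core utilization bound (percent)

model = cp_model.CpModel()
# x[t][c] == 1 iff task t runs on core c
x = [[model.NewBoolVar(f"x_{t}_{c}") for c in range(num_cores)]
     for t in range(len(task_util))]

for t in range(len(task_util)):
    model.Add(sum(x[t]) == 1)                        # each task gets exactly one core
for c in range(num_cores):
    model.Add(sum(task_util[t] * x[t][c]
                  for t in range(len(task_util))) <= capacity)

solver = cp_model.CpSolver()
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):  # a solution is found whenever one exists
    for t in range(len(task_util)):
        core = next(c for c in range(num_cores) if solver.BooleanValue(x[t][c]))
        print(f"task {t} -> core {core}")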

Relevance:

20.00%

Publisher:

Abstract:

23rd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing (PDP 2015), 4–6 March 2015. Turku, Finland.

Relevance:

20.00%

Publisher:

Abstract:

Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test a hardware product under development. Although this is an attractive solution (low cost and an easy, fast way to carry out coursework), it has major disadvantages. As everything is currently done with, and in, a computer, students are losing the "feel" for the real values of physical magnitudes. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithmics, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, when real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays; likewise, when using spreadsheets to build graphics, instead of a random table students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by different courses in which computers are part of the teaching/learning process, giving students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that students access over an Ethernet protocol or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor values in their different courses and, consequently, in different types of software: numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools, allowing sensors that are not available at one school to be used by obtaining the values from other places that share them. Moreover, students in more advanced years, with (theoretically) more know-how, can use courses related to electronics development to build new sensor boards and expand the framework further. The final solution is very attractive: low cost, simple to develop, and flexible, since the same materials can be reused in several courses, bringing real-world data into the students' computer work.
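
As a purely hypothetical illustration of how a student program could pull a reading from the central sensor server described above, the following Python snippet fetches a temperature value over HTTP; the URL and the plain-text response format are assumptions, not the framework's actual interface.

# Hypothetical example of reading a real sensor value from a central server;
# the endpoint and response format are assumed, not the framework's real API.
import urllib.request

SENSOR_URL = "http://sensors.example.edu/room1/temperature"  # hypothetical endpoint

with urllib.request.urlopen(SENSOR_URL, timeout=5) as response:
    reading = float(response.read().decode().strip())        # e.g. "21.7" degrees Celsius

print(f"Current room temperature: {reading:.1f} °C")
# The value can now feed a spreadsheet, a numerical-analysis exercise,
# or any course assignment that needs a real dataset instead of random numbers.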

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose the Distributed using Optimal Priority Assignment (DOPA) heuristic, which finds a feasible partitioning and priority assignment for distributed applications based on the linear transactional model. DOPA partitions the tasks and messages in the distributed system and makes use of the Optimal Priority Assignment (OPA) algorithm, known as Audsley's algorithm, to find the priorities for that partition. The experimental results show how the use of the OPA algorithm increases, on average, the number of schedulable tasks and messages in a distributed system when compared with the use of Deadline Monotonic (DM), usually favoured in other works. Afterwards, we extend these results to the assignment of Parallel/Distributed applications and present a second heuristic named Parallel-DOPA (P-DOPA). In that case, we show how the partitioning process can be simplified by using the Distributed Stretch Transformation (DST), a parallel transaction transformation algorithm introduced in [1].
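
For reference, a minimal Python sketch of Audsley's Optimal Priority Assignment algorithm, on which DOPA relies, is given below; the schedulability test is passed in as a function and is not reproduced here.

# Minimal sketch of Audsley's Optimal Priority Assignment (OPA).
def audsley_opa(tasks, is_schedulable_at_lowest):
    """Return a priority ordering (lowest priority first) or None if none exists.

    tasks: list of task identifiers.
    is_schedulable_at_lowest(task, higher_set): True if `task` meets its deadline
        when every task in `higher_set` has higher priority than `task`.
    """
    unassigned = set(tasks)
    order_lowest_first = []
    for _ in range(len(tasks)):                  # one priority level per iteration
        for task in sorted(unassigned):
            others = unassigned - {task}
            if is_schedulable_at_lowest(task, others):
                order_lowest_first.append(task)  # give `task` the current (lowest) level
                unassigned.remove(task)
                break
        else:
            return None                          # no task fits this level: unschedulable
    return order_lowest_first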

Relevance:

20.00%

Publisher:

Abstract:

XXXIII Simpósio Brasileiro de Redes de Computadores e Sistemas Distribuídos (SBRC 2015), III Workshop de Comunicação em Sistemas Embarcados Críticos. Vitória, Brazil.

Relevance:

20.00%

Publisher:

Abstract:

In a mobile ad-hoc network environment, where resources are scarce, knowledge about the network's link state is essential to optimize the routing procedures. This paper presents a study of different pheromone evaluation models and how they react to possible changes in the traffic rate. By observing how the pheromone value on a link changes, it may be possible to identify certain patterns that indicate the path status. For this study, the behavior of the Ant System evaluation model was compared with a Temporal Active Pheromone model (a biological approach) and a Progressive Pheromone Reduction model, with and without a maximum pheromone limit.
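
For context, the following Python sketch shows the classic Ant System pheromone update on a single link, one of the evaluation models compared in the study; the evaporation rate, deposits and optional cap are illustrative values, and the Temporal Active Pheromone and Progressive Pheromone Reduction variants are not reproduced.

# Sketch of the standard Ant System pheromone update on one link.
def ant_system_update(tau, deposits, rho=0.1, tau_max=None):
    """One update step: evaporate, then add the deposits left by passing ants."""
    tau = (1.0 - rho) * tau + sum(deposits)
    if tau_max is not None:                    # variant with a maximum pheromone limit
        tau = min(tau, tau_max)
    return tau

# A link carrying steady traffic keeps its pheromone high; when traffic stops,
# the value decays, and the changing pattern hints at the path status.
tau = 1.0
for step in range(5):
    tau = ant_system_update(tau, deposits=[0.2, 0.3])
print(round(tau, 3))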

Relevance:

20.00%

Publisher:

Abstract:

Canadian Journal of Civil Engineering, 36(10), 1605–1616.

Relevance:

20.00%

Publisher:

Abstract:

Maturity models are instruments that facilitate the management of organizations, including the management of their information systems function, and hospital organizations are no exception. This article presents initial research aimed at developing a comprehensive maturity model for the management of hospital information systems. The development of this model is justified because the current maturity models in the domain of hospital information systems management are still at an embryonic stage of development, mainly because they lack detail, do not provide tools for determining maturity, and do not present the characteristics of the maturity stages structured by different influence factors.

Relevance:

20.00%

Publisher:

Abstract:

Soil is a multifunctional resource that is vital to humanity, providing ecological, technical-industrial, socio-economic and cultural functions and constituting a vast and irreplaceable natural capital. Given its potentially rapid degradation rate, which has been increasing in recent decades due to growing economic development and world population growth, soil is today a finite and limited resource. In view of this problem, this document addresses the growing concern over geo-environmental issues and the research surrounding them, evaluating how contaminants disperse through the soil in its different phases (solid, liquid and gaseous). The experimental part focused on the analysis of benzene adsorption through the determination of adsorption isotherms. To this end, reactors containing limestone were prepared, some of them previously contaminated with a biofuel, biodiesel, at a constant concentration. The process was monitored through the temporal evolution of the gas-phase concentration, using gas chromatography. Among the objectives, the aim was to analyze the distribution of contaminants across the soil phases, to fit the experimental data to the Langmuir, Freundlich and polynomial mathematical models, and to verify and discuss the most suitable solutions.
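
As an illustrative sketch (not the study's actual data or code), the Langmuir and Freundlich isotherm models mentioned above can be fitted to adsorption measurements with SciPy as follows; the concentration and uptake values are placeholders.

# Fitting Langmuir and Freundlich isotherms to placeholder adsorption data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    # q = q_max * K * C / (1 + K * C)
    return q_max * K * C / (1.0 + K * C)

def freundlich(C, K_f, n):
    # q = K_f * C^(1/n)
    return K_f * np.power(C, 1.0 / n)

C = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # equilibrium concentration (placeholder units)
q = np.array([0.8, 1.3, 1.9, 2.4, 2.8])       # adsorbed amount (placeholder units)

popt_l, _ = curve_fit(langmuir, C, q, p0=[3.0, 1.0])
popt_f, _ = curve_fit(freundlich, C, q, p0=[1.0, 2.0])
print("Langmuir  q_max, K :", popt_l)
print("Freundlich K_f, n  :", popt_f)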

Relevance:

20.00%

Publisher:

Abstract:

This study focuses on the probabilistic modelling of the mechanical properties of prestressing strands, based on data collected from tensile tests carried out at the Laboratório Nacional de Engenharia Civil (LNEC), Portugal, for certification purposes, covering a period of about 9 years of production. The strands studied were produced by six manufacturers from four countries, namely Portugal, Spain, Italy and Thailand. The variability of the most important mechanical properties is examined and the results are compared with the recommendations of the Probabilistic Model Code, as well as the Eurocodes and earlier studies. The obtained results show a very low variability which, of course, benefits structural safety. Based on those results, probabilistic models for the most important mechanical properties of prestressing strands are proposed.
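
As a rough sketch of this kind of variability analysis (with invented sample values, not LNEC test data), one could compute the mean, standard deviation and coefficient of variation of a strand property and fit a simple distribution as follows; the choice of a normal model here is only an assumption.

# Variability summary and a fitted normal model for a strand property (placeholder data).
import numpy as np
from scipy import stats

tensile_strength = np.array([1860, 1875, 1852, 1881, 1869, 1858, 1872, 1866])  # MPa, invented

mean = tensile_strength.mean()
std = tensile_strength.std(ddof=1)
cov = std / mean                              # a low CoV benefits structural safety
mu, sigma = stats.norm.fit(tensile_strength)  # parameters of a fitted normal model

print(f"mean = {mean:.1f} MPa, std = {std:.1f} MPa, CoV = {cov:.3%}")
print(f"fitted normal: mu = {mu:.1f}, sigma = {sigma:.1f}")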

Relevance:

20.00%

Publisher:

Abstract:

Smart Grids (SGs) have emerged as the new paradigm for power system operation and management, being designed to include large amounts of distributed energy resources. This new paradigm requires new Energy Resource Management (ERM) methodologies that consider different operation strategies and the existence of new management players, such as several types of aggregators. This paper proposes a methodology to facilitate the coalition of distributed generation units into Virtual Power Players (VPPs), based on a game theory approach. The proposed approach consists of analysing the classifications attributed by each VPP to the distributed generation units, as well as the contracts previously established by each player. The proposed classification model is based on fourteen parameters, including technical, economic and behavioural ones. Depending on the VPP's strategies, size and goals, each parameter has a different importance. VPPs can also manage other types of energy resources, such as storage units, electric vehicles, demand response programs, or even parts of the MV and LV distribution networks. A case study with twelve VPPs with different characteristics and one hundred and fifty real distributed generation units is included in the paper.
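
As a very small sketch of the weighted classification idea (the fourteen parameters and the VPPs' actual weights are not reproduced; the names and numbers below are invented), each VPP can score a distributed generation unit with its own weights:

# Weighted-sum classification of a DG unit from each VPP's point of view (illustrative only).
def classify_dg_unit(parameter_scores, vpp_weights):
    """Return the weighted average score of one DG unit for one VPP."""
    total_weight = sum(vpp_weights.values())
    return sum(vpp_weights[p] * parameter_scores[p] for p in vpp_weights) / total_weight

dg_unit = {"availability": 0.9, "price": 0.6, "contract_history": 0.8}   # hypothetical scores
vpp_a = {"availability": 5, "price": 2, "contract_history": 3}            # technically driven VPP
vpp_b = {"availability": 1, "price": 6, "contract_history": 3}            # cost-driven VPP

print(classify_dg_unit(dg_unit, vpp_a))   # different VPPs rank the same unit differently
print(classify_dg_unit(dg_unit, vpp_b))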