965 results for Iterative Assignment


Relevance:

10.00%

Publisher:

Abstract:

Final Master's project for obtaining the degree of Master in Electronics and Telecommunications Engineering.

Relevance:

10.00%

Publisher:

Abstract:

Volatile organic compounds are a common source of groundwater contamination that can be easily removed by air stripping in randomly packed columns with counter-current flow between the phases. This work proposes a new column design methodology, valid for any type of packing and contaminant, that avoids both the need for an arbitrarily chosen diameter and the usual graphical Eckert correlations for pressure drop; the hydraulic features are instead fixed in advance as a design criterion. The design procedure was implemented as an algorithm in C++. A column was built to test the design and its theoretical steady-state and dynamic behaviour, with experiments conducted on a solution of chloroform in distilled water. The results allowed a correction to the theoretical global mass transfer coefficient previously estimated by the Onda correlations, which depend on several parameters that are difficult to control experimentally. To better describe the column behaviour under stationary and dynamic conditions, an original mathematical model was developed, consisting of a system of two nonlinear partial differential equations (distributed parameters). When the flows are steady the system becomes linear, although no closed-form analytical solution is evident: in steady state the resulting ODE can be solved analytically, while in the dynamic case the difficulty is overcome by discretizing the PDE with finite differences. A numerical algorithm was then used to estimate the contaminant concentrations in both phases along the column. The large number of resulting algebraic equations and the impossibility of generating a recursive procedure precluded a generalized program, but an iterative procedure developed in a spreadsheet allowed the simulation. The solution is stable only for matched discretization values; if different time/space discretization parameters are used, it easily becomes unstable. The dynamic behaviour of the system was simulated for the common liquid-phase perturbations: step, impulse, rectangular pulse, and sinusoidal. The final results show no strange or unpredictable behaviour.
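The discretization strategy summarized above lends itself to a compact illustration. The following Python sketch (the paper's actual implementation used C++ and a spreadsheet) applies an explicit upwind finite-difference scheme to a simplified two-film counter-current model; all parameter values, the transfer term, and the boundary conditions are illustrative assumptions, not the paper's.

```python
import numpy as np

def simulate_column(nz=50, nt=2000, L=2.0, t_end=200.0,
                    uL=0.01, uG=0.05, KLa=0.02, H=0.15, cL_in=1.0):
    dz, dt = L / nz, t_end / nt
    # CFL-style check: mirrors the paper's observation that the solution is
    # stable only for matched time/space discretization values.
    assert uL * dt / dz <= 1.0 and uG * dt / dz <= 1.0, "unstable discretization"
    cL = np.zeros(nz + 1)   # liquid-phase concentration, z = 0 at the top
    cG = np.zeros(nz + 1)   # gas-phase concentration, gas flows upward
    for _ in range(nt):
        flux = KLa * (cL - cG / H)          # hypothetical two-film transfer term
        cL_new, cG_new = cL.copy(), cG.copy()
        # liquid flows in +z: upwind difference toward the node above
        cL_new[1:] = cL[1:] - uL * dt / dz * (cL[1:] - cL[:-1]) - dt * flux[1:]
        # gas flows in -z: upwind difference toward the node below
        cG_new[:-1] = cG[:-1] + uG * dt / dz * (cG[1:] - cG[:-1]) + dt * flux[:-1]
        cL_new[0] = cL_in                   # contaminated liquid fed at the top
        cG_new[-1] = 0.0                    # clean air enters at the bottom
        cL, cG = cL_new, cG_new
    return cL, cG
```

Consistent with the stability observation above, the explicit scheme diverges when the time step is too large relative to the spatial step, which is why the CFL-style assertion is included.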

Relevance:

10.00%

Publisher:

Abstract:

The ability to resolve conflicting beliefs is crucial for multi-agent systems where the information is dynamic, incomplete, and distributed over a group of autonomous agents. The proposed distributed belief revision approach consists of a distributed truth maintenance system and a set of autonomous belief revision methodologies. The agents have partial views and frequently hold disparate beliefs, which are automatically detected by the system's reason maintenance mechanism. The nature of these conflicts is dynamic and requires adequate methodologies for conflict resolution. The two types of conflicting beliefs addressed in this paper are Context Dependent and Context Independent Conflicts, which result, in the first case, from the assignment by different agents of opposite belief statuses to the same belief and, in the latter case, from holding contradictory distinct beliefs. The belief revision methodology for solving Context Independent Conflicts is, basically, a selection process based on the assessment of the credibility of the opposing belief statuses. The belief revision methodology for solving Context Dependent Conflicts is, essentially, a search process for a consensual alternative based on a "next best" relaxation strategy.
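As a purely hypothetical illustration of the credibility-based selection described for Context Independent Conflicts, consider the following sketch; the BeliefStatus record, the aggregation rule, and the numeric credibilities are assumptions for illustration, not the paper's formalism.

```python
from dataclasses import dataclass

@dataclass
class BeliefStatus:
    agent: str
    status: bool          # True if the agent believes the statement
    credibility: float    # hypothetical credibility weight in [0, 1]

def resolve_context_independent(statuses):
    """Pick the belief status whose supporters carry more total credibility."""
    support = sum(s.credibility for s in statuses if s.status)
    against = sum(s.credibility for s in statuses if not s.status)
    return support >= against

# One agent with high credibility outweighs two lower-credibility agents.
conflict = [BeliefStatus("a1", True, 0.3),
            BeliefStatus("a2", True, 0.2),
            BeliefStatus("a3", False, 0.9)]
print(resolve_context_independent(conflict))   # -> False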

Relevance:

10.00%

Publisher:

Abstract:

The iterative simulation of the Brownian bridge is well known. In this article, we present a vectorial simulation alternative, based on Gaussian processes for machine learning regression, that is suitable for implementation in interpreted programming languages. We extend the vectorial simulation of path-dependent trajectories to other Gaussian processes, namely sequences of Brownian bridges, geometric Brownian motion, fractional Brownian motion, and the Ornstein-Uhlenbeck mean-reversion process.
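A minimal sketch of what such a vectorial (loop-free) simulation can look like, assuming the standard Gaussian-process view of the Brownian bridge with mean t*b/T and covariance min(s,t) - s*t/T; the article's exact GP-regression formulation may differ.

```python
import numpy as np

def brownian_bridge(n=500, T=1.0, b=0.0, n_paths=10, rng=None):
    rng = np.random.default_rng(rng)
    t = np.linspace(0.0, T, n + 2)[1:-1]              # interior time grid
    K = np.minimum.outer(t, t) - np.outer(t, t) / T   # bridge covariance
    L = np.linalg.cholesky(K + 1e-12 * np.eye(n))     # jitter for stability
    z = rng.standard_normal((n, n_paths))
    paths = t[:, None] * b / T + L @ z                # mean + correlated noise
    # prepend/append the pinned endpoints B(0) = 0 and B(T) = b
    return np.vstack([np.zeros(n_paths), paths, np.full(n_paths, b)])
```

Each column of the returned array is one bridge path pinned at B(0) = 0 and B(T) = b, generated with matrix operations only, with no per-step loop.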

Relevance:

10.00%

Publisher:

Abstract:

This paper discusses the changes brought by the communication revolution to teaching and learning in the scope of LSP. Its aim is to provide insight into how teaching, once bi-dimensional, turned into a multidimensional system, gathering complementary resources that have transformed, in an incredibly short time, the ways we receive, share, and store information, for instance as professionals, and keep in touch with our peers. The rise of electronic publications and the boom of social and professional networks, search engines, blogs, listservs, forums, e-mail blasts, Facebook pages, YouTube content, tweets, and apps have reshaped the way information is conveyed. Classes have ceased to be predictable and have been empowered by digital platforms and numerous data repositories (TILDE, IATE, LINGUEE, and many other terminological data banks) that have definitively transformed the academic world in general and tertiary education in particular. There is a bulk of information to be digested by students, who are no longer passive but instead active and responsible for their academic outcomes. The question is whether, given that overflow, they possess the tools to select only what is accurate and important for a given subject or assignment. After the implementation of Bologna, with the reduction in the number of course years in most degrees and the shrinking of curricular contents, do students still have the opportunity to develop critical thinking? Both teaching and learning rely on digital resources to speed the spread of knowledge, but have those changes been effective in really promoting communication? Furthermore, with the growing number of apps for learning foreign languages, for translation, and for other tasks, will students still feel the need to learn these skills once they have those apps? These are some of the questions we discuss in this paper.

Relevance:

10.00%

Publisher:

Abstract:

Project work submitted to the Escola Superior de Teatro e Cinema in fulfilment of the requirements for obtaining the degree of Master in Theatre, specialization in Stage Design (Design de Cena).

Relevance:

10.00%

Publisher:

Abstract:

Hard real-time multiprocessor scheduling has, in recent years, seen the flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling in order to achieve efficient utilization of the system's processing resources with strong schedulability guarantees and low dispatching overheads. The sub-class of slot-based "task-splitting" scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. However, until now no unified schedulability theory existed for such algorithms; each was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). The new theory is based on exact schedulability tests, thereby also overcoming many sources of pessimism in existing analyses. In turn, since schedulability testing guides the task assignment under the schemes in consideration, we also formulate an improved task assignment procedure. As the other main contribution of this article, and in response to the fact that many unrealistic assumptions present in the original theory tend to undermine the theoretical potential of such scheduling schemes, we identified and modelled into the new analysis all overheads incurred by the algorithms under consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
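To make concrete how a schedulability test can drive task assignment, here is a deliberately simplified first-fit sketch in Python, using the classic Liu and Layland utilization bound as a stand-in test; the actual S-EKG/NPS-F procedures are slot-based, split tasks across processors, and use the exact tests developed in the article.

```python
def utilization_bound(n):
    """Liu & Layland bound for n tasks under rate-monotonic scheduling."""
    return n * (2 ** (1.0 / n) - 1)

def first_fit_assign(task_utils, n_procs):
    """Assign task utilizations to processors; a task joins a processor
    only if the utilization test still passes with it included."""
    bins = [[] for _ in range(n_procs)]
    for u in sorted(task_utils, reverse=True):       # heaviest tasks first
        for b in bins:
            if sum(b) + u <= utilization_bound(len(b) + 1):
                b.append(u)
                break
        else:
            return None                              # assignment failed
    return bins

print(first_fit_assign([0.5, 0.4, 0.3, 0.2, 0.1], n_procs=2))
# -> [[0.5, 0.3], [0.4, 0.2, 0.1]]
```

The point of the sketch is the coupling: a tighter (less pessimistic) test admits more tasks per processor, which is why replacing utilization bounds with exact tests, as the article does, directly improves the assignment.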

Relevance:

10.00%

Publisher:

Abstract:

Final Master's project for obtaining the degree of Master in Mechanical Engineering.

Relevance:

10.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for obtaining the degree of Master in Electrical and Computer Engineering.

Relevance:

10.00%

Publisher:

Abstract:

The application of compressive sensing (CS) to hyperspectral images has been an active area of research over the past few years, both in terms of hardware and of signal processing algorithms. However, CS algorithms can be computationally very expensive due to the extremely large volumes of data collected by imaging spectrometers, a fact that compromises their use in applications under real-time constraints. This paper proposes four efficient implementations of hyperspectral coded aperture (HYCA) for CS on commodity graphics processing units (GPUs): two of them, termed P-HYCA and P-HYCA-FAST, and two additional implementations of its constrained version (CHYCA), termed P-CHYCA and P-CHYCA-FAST. The HYCA algorithm exploits the high correlation among the spectral bands of hyperspectral data sets and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. The proposed P-HYCA and P-CHYCA implementations were developed using the compute unified device architecture (CUDA) and the cuFFT library. In the P-HYCA-FAST and P-CHYCA-FAST implementations, this library is replaced by a fast iterative method, leading to very significant speedup factors towards meeting real-time requirements. The proposed algorithms are evaluated not only in terms of reconstruction error for different compression ratios but also in terms of computational performance, using two different NVIDIA GPU architectures: 1) GeForce GTX 590 and 2) GeForce GTX TITAN. Experiments conducted with both simulated and real data reveal considerable acceleration factors and good results in the task of compressing remotely sensed hyperspectral data sets.
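As a generic illustration of the iterative reconstruction at the heart of compressive sensing (not the HYCA algorithm itself, which additionally exploits spectral correlation and a small endmember set), the following sketch implements ISTA (iterative shrinkage-thresholding) on a toy sparse-recovery problem; all sizes and parameters are assumptions.

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=200):
    """Solve min_x 0.5*||A x - y||^2 + lam*||x||_1 by iterative thresholding."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = Lipschitz const. of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - step * A.T @ (A @ x - y)        # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)   # random measurement matrix
x_true = np.zeros(200)
x_true[[5, 50, 120]] = [1.0, -0.7, 0.5]            # sparse ground truth
x_hat = ista(A, A @ x_true, lam=0.01)
print(np.round(x_hat[[5, 50, 120]], 2))            # recovered sparse coefficients
```

The sketch shows why far fewer measurements than unknowns (80 versus 200 here) can suffice when the signal is sparse, which is the same principle HYCA leverages through the low-dimensional endmember representation.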

Relevance:

10.00%

Publisher:

Abstract:

Master's degree in Civil Engineering – Structures branch.

Relevance:

10.00%

Publisher:

Abstract:

Master's degree in Electrical and Computer Engineering – Autonomous Systems branch.

Relevance:

10.00%

Publisher:

Abstract:

This work arises within the field of electromedicine, an increasingly influential and continuously developing branch of electrical engineering, marked by constant innovation and by efforts to develop and apply new technologies. The main objective of this project is an in-depth study of the applications of the SVD (Singular Value Decomposition) technique, a powerful mathematical tool that allows signals to be manipulated through matrix decomposition, to the specific case of the electrical signal obtained from an electrocardiogram (ECG). The principles of operation of the cardiac electrical system, the main components of the ECG signal (the P wave, the QRS complex, and the T wave), and the fundamentals of the SVD technique will be described. The final phase of this work will consist of applying, in the Matlab environment, the SVD technique to real ECG signals, with emphasis on filtering for noise removal. In order to assess its advantages and disadvantages against other techniques, the results of SVD filtering will be compared with those obtained, under similar conditions, with a static-coefficient FIR filter and with an iterative adaptive filter.
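A minimal sketch of the SVD filtering idea, in Python rather than the Matlab used in the project: stack the quasi-periodic signal into a segment matrix, truncate to the dominant singular components, and reconstruct. Segment length, rank, and the synthetic pseudo-ECG are illustrative assumptions.

```python
import numpy as np

def svd_denoise(signal, seg_len=200, rank=3):
    """Rank-truncated SVD reconstruction of a signal reshaped into segments."""
    n_seg = len(signal) // seg_len
    X = signal[: n_seg * seg_len].reshape(n_seg, seg_len)  # one segment per row
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[rank:] = 0.0                     # discard low-energy (noise) components
    return (U * s) @ Vt                # reconstruct from the dominant components

# Example: a noisy periodic pseudo-ECG; the repeating beat structure makes
# the clean signal approximately low-rank across segments.
beat = np.exp(-((np.arange(200) - 100) / 12.0) ** 2)    # one synthetic "beat"
ecg = np.tile(beat, 20)
noisy = ecg + 0.3 * np.random.default_rng(1).standard_normal(ecg.shape)
clean = svd_denoise(noisy).ravel()
```

The approach works because repeated beats are highly correlated, so the signal energy concentrates in a few singular components while broadband noise spreads across all of them.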

Relevance:

10.00%

Publisher:

Abstract:

Dissertation presented for obtaining the degree of Doctor in Informatics Engineering from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance:

10.00%

Publisher:

Abstract:

This chapter examines the cross-cultural influence of training on the adjustment of international assignees. We focus on pre-departure training (PDT) before an international assignment. This is an important topic because, in today's globalized world, more and more expatriations are needed, and the absence of PDT may cause the expatriation experience to fail. Companies may neglect PDT due to cost-reduction practices and ignorance of the need for it. Data were collected through semi-structured interviews with 42 Portuguese international assignees and 18 organizational representatives from nine Portuguese companies. The results suggest that companies should develop PDT programs, particularly when the cultural distance to the host country is greater and when the company has no previous experience of expatriation to that country. The study is original in that it details in depth the methods of PDT, their problems, and their consequences. Some limitations linked to the research design, detailed in the conclusion, should be overcome in future studies.