996 results for dynamic dispersion compensation
Abstract:
This paper proposes an efficient, scalable Residue Number System (RNS) architecture supporting moduli sets with an arbitrary number of channels, making it possible to achieve a larger dynamic range and a higher level of parallelism. The proposed architecture performs both forward and reverse RNS conversion by reusing the arithmetic channel units. The arithmetic operations supported at the channel level include addition, subtraction, and multiplication with accumulation capability. For the reverse conversion, two algorithms are considered, one based on the Chinese Remainder Theorem and the other on Mixed-Radix Conversion, leading to implementations optimized for delay and for the required circuit area. The proposed architecture thus yields a complete and compact RNS platform. Experimental results suggest gains of 17% in the delay of the arithmetic operations, with an area reduction of 23% relative to the RNS state of the art. Compared with a binary system, the proposed architecture performs the same computation 20 times faster while using only 10% of the circuit area resources.
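For orientation, the sketch below shows the basic RNS building blocks referred to above: channel-wise arithmetic on residues, forward conversion, and CRT-based reverse conversion. The moduli set and operands are illustrative assumptions, not taken from the paper, and this is plain software rather than the proposed hardware architecture.

from math import prod

def to_rns(x, moduli):
    """Forward conversion: represent x by its residue in each channel."""
    return [x % m for m in moduli]

def from_rns(residues, moduli):
    """Reverse conversion via the Chinese Remainder Theorem."""
    M = prod(moduli)                      # dynamic range of the moduli set
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)      # pow(Mi, -1, m): modular inverse of Mi mod m
    return x % M

moduli = [3, 5, 7]                        # illustrative pairwise-coprime moduli set
a, b = 11, 9
# Addition (and likewise multiplication) is carried out independently per channel.
sum_rns = [(ra + rb) % m for ra, rb, m in zip(to_rns(a, moduli), to_rns(b, moduli), moduli)]
assert from_rns(sum_rns, moduli) == (a + b) % prod(moduli)   # 20 == 20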
Abstract:
A QoS adaptation to dynamically changing system conditions that takes into account the user's constraints on the stability of service provisioning is presented. The goal is to allow the system to make QoS adaptation decisions in response to fluctuations in task traffic flow, under the control of the user. We pay special attention to the case where monitoring the stability period and resource load variation of Service Level Agreements for different types of services is used to dynamically adapt future stability periods, according to a feedback control scheme. The system's adaptation behaviour can be configured according to a desired confidence level on future resource usage. The viability of the proposed approach is validated by preliminary experiments.
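As a rough illustration of the feedback idea described above (the function name and the adaptation rule are assumptions for the example, not taken from the paper), the next stability period can be lengthened when the observed load variation stays within a target for the configured confidence level, and shortened otherwise:

def next_stability_period(current_period, load_samples, target_variation, confidence=0.9,
                          shrink=0.5, grow=1.2, min_period=1.0, max_period=3600.0):
    """Adapt the next SLA stability period from the observed resource load variation.

    If the observed variation stays below the target for at least the desired
    fraction of samples (the confidence level), the period is extended;
    otherwise it is shortened so renegotiation happens sooner.
    """
    within = sum(1 for v in load_samples if v <= target_variation) / len(load_samples)
    factor = grow if within >= confidence else shrink
    return max(min_period, min(max_period, current_period * factor))

# Example: 9 of 10 observed load variations are below the 0.15 target,
# so a 60 s stability period is extended to 72 s.
print(next_stability_period(60.0, [0.05, 0.1, 0.12, 0.2, 0.08, 0.09, 0.11, 0.07, 0.1, 0.13], 0.15))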
Abstract:
Many studies have demonstrated the relationship between alpha activity and central visual ability, in which the visual ability is usually assessed through static stimuli. Beyond static conditions, however, the real environment often involves dynamic changes, and peripheral visual ability in a dynamic environment (i.e., dynamic peripheral visual ability) is important for everyone. So far, no work has reported whether there is a relationship between dynamic peripheral visual ability and alpha activity. Thus, the objective of this study was to investigate their relationship. Sixty-two soccer players performed a newly designed peripheral vision task in which the visual stimuli were dynamic, while their EEG signals were recorded from the Cz, O1, and O2 locations. The relationship between dynamic peripheral visual performance and alpha activity was examined by the percentage bend correlation test. The results indicated no significant correlation between dynamic peripheral visual performance and the alpha amplitudes in the eyes-open and eyes-closed resting conditions. However, this was not the case for the alpha activity during the peripheral vision task: dynamic peripheral visual performance showed significant positive inter-individual correlations with the amplitudes in the alpha band (8-12 Hz) and the individual alpha band (IAB) during the peripheral vision task. A potential application of this finding is to improve dynamic peripheral visual performance by up-regulating alpha activity using neuromodulation techniques.
Abstract:
The concepts of fractional calculus (FC) are applied in almost all areas of science and engineering. Its ability to yield superior modeling and control in many dynamical systems is well recognized. In this article, we introduce the fundamental aspects associated with the application of FC to the control of dynamic systems.
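As a minimal illustration of how FC enters the control of dynamic systems (the test function and step size below are assumptions for the example, not taken from the article), the Grünwald-Letnikov definition approximates a fractional derivative of order alpha as a weighted sum of past samples, which is the basic building block of fractional-order controllers:

import math

def gl_fractional_derivative(samples, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha derivative at the
    latest sample, using the whole available history with uniform step h."""
    w = 1.0            # w_0 = 1; w_k = w_{k-1} * (1 - (alpha + 1) / k) equals (-1)^k * C(alpha, k)
    acc = samples[-1]
    for k in range(1, len(samples)):
        w *= 1.0 - (alpha + 1.0) / k
        acc += w * samples[-1 - k]
    return acc / h**alpha

# Half-order derivative of f(t) = t on a uniform grid over [0, 1]; the analytic
# value at t = 1 is 2*sqrt(1/pi), so the approximation should be close to it.
h = 1e-3
ts = [k * h for k in range(int(1 / h) + 1)]
print(gl_fractional_derivative(ts, 0.5, h), 2 * math.sqrt(1 / math.pi))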
Abstract:
OBJECTIVE: To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS: Data envelopment analysis, a linear programming technique, creates a best-practice frontier by comparing observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model, medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model, financing budget) to analyze frontier shifts in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS: The mean scores for health care, teaching, and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best-performing year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits, 34.0% in admissions, 12.0% in undergraduate students, 13.0% in multi-professional residents, 48.0% in graduate students, and 7.0% in research projects, as well as a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in the financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care, and no variation in research. CONCLUSIONS: The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best-practice frontier.
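For context, the core of a plain output-oriented DEA model with variable returns to scale is a small linear program solved once per unit; the sketch below, with invented input/output data, illustrates that building block only, not the full dynamic network model used in the paper:

import numpy as np
from scipy.optimize import linprog

def output_oriented_vrs_score(inputs, outputs, unit):
    """Output-oriented BCC (variable returns to scale) efficiency for one unit.

    inputs: (n_units, n_inputs) array; outputs: (n_units, n_outputs) array.
    Decision variables are [phi, lambda_1, ..., lambda_n]; phi is the factor by
    which all outputs of `unit` could be expanded on the best-practice frontier.
    """
    n = inputs.shape[0]
    c = np.zeros(n + 1); c[0] = -1.0                       # maximise phi
    A_ub, b_ub = [], []
    for i in range(inputs.shape[1]):                       # sum_j lambda_j * x_ij <= x_i,unit
        A_ub.append(np.r_[0.0, inputs[:, i]]); b_ub.append(inputs[unit, i])
    for r in range(outputs.shape[1]):                      # phi * y_r,unit <= sum_j lambda_j * y_rj
        A_ub.append(np.r_[outputs[unit, r], -outputs[:, r]]); b_ub.append(0.0)
    A_eq = [np.r_[0.0, np.ones(n)]]                        # VRS: sum_j lambda_j = 1
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return 1.0 / res.x[0]                                  # efficiency score in (0, 1]

# Invented data: 4 hospitals, 1 input (budget) and 2 outputs (admissions, research projects).
x = np.array([[10.0], [12.0], [8.0], [15.0]])
y = np.array([[100.0, 5.0], [120.0, 6.0], [60.0, 2.0], [130.0, 9.0]])
print([round(output_oriented_vrs_score(x, y, j), 3) for j in range(4)])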
Abstract:
One fundamental idea of service-oriented computing is that applications should be developed by composing already available services. Due to the long-running nature of service interactions, a main challenge in service composition is ensuring the correctness of transaction recovery. In this paper, we use a process calculus suitable for modelling long-running transactions with a recovery mechanism based on compensations. Within this setting, we discuss and formally state correctness criteria for compositions of compensable processes, assuming that each process is correct with respect to transaction recovery. Under our theory, we formally interpret self-healing compositions, which can detect and recover from faults, as correct compositions of compensable processes. Moreover, we develop an automated verification approach and apply it to an illustrative case study.
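As a loose, language-level analogue of the compensation mechanism discussed above (the activity names and structure are illustrative assumptions, not the paper's calculus), a long-running transaction can be seen as a sequence of steps, each paired with a compensation that is run in reverse order when a later step fails:

def run_compensable(steps):
    """Run (action, compensation) pairs in order; if a later action fails,
    execute the compensations of the completed steps in reverse order."""
    done = []
    try:
        for action, compensation in steps:
            action()
            done.append(compensation)
    except Exception:
        for compensation in reversed(done):
            compensation()
        raise

# Illustrative composition: the failing payment triggers compensation of the booking.
log = []

def fail_payment():
    raise RuntimeError("payment refused")

steps = [
    (lambda: log.append("book hotel"), lambda: log.append("cancel hotel")),
    (fail_payment, lambda: log.append("refund payment")),
]
try:
    run_compensable(steps)
except RuntimeError:
    pass
print(log)  # ['book hotel', 'cancel hotel']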
Abstract:
Thesis for the Degree of Master of Science in Biotechnology, Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
In this study, the inhalation doses and the respective risk are calculated for the population living within a 20 km radius of a coal-fired power plant. The dispersion and deposition of natural radionuclides were simulated by a Gaussian dispersion model estimating the ground-level activity concentration. The annual effective dose and total risk were 0.03205 mSv/y and 1.25 × 10⁻⁸, respectively. The effective dose is lower than the limit established by the ICRP and the risk is lower than the limit proposed by the U.S. EPA, which means that the considered exposure does not pose a risk to public health.
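For reference, the ground-level concentration predicted by a basic Gaussian plume model with full ground reflection has the form sketched below; the emission rate, wind speed, stack height, and dispersion coefficients used here are placeholder values, not the ones from the study:

import math

def ground_level_concentration(Q, u, H, y, sigma_y, sigma_z):
    """Ground-level (z = 0) Gaussian plume concentration with total reflection.

    Q: emission rate (e.g. Bq/s), u: wind speed (m/s), H: effective stack
    height (m), y: crosswind distance (m), sigma_y/sigma_z: dispersion
    coefficients (m) evaluated at the downwind distance of interest.
    """
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-H**2 / (2 * sigma_z**2)))

# Placeholder example: centreline (y = 0) concentration for a tall stack.
print(ground_level_concentration(Q=1.0, u=5.0, H=225.0, y=0.0, sigma_y=150.0, sigma_z=80.0))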
Abstract:
Final Master's project for obtaining the degree of Master in Electronics and Telecommunications Engineering
Abstract:
15th IEEE International Conference on Electronics, Circuits and Systems, Malta
Abstract:
This work deals with the numerical simulation of the air stripping process for the pre-treatment of groundwater used for human consumption. The steady-state model has an exponential solution that is used, together with the Tau Method, to obtain a spectral approximation of the solution of the system of partial differential equations associated with the transient model.
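As a minimal illustration of the Tau Method mentioned above (applied here to a simple model ODE on [-1, 1], not to the air stripping PDE system), the solution is expanded in Chebyshev polynomials, the lowest-order residual coefficients are forced to vanish, and the dropped equation is replaced by the boundary condition:

import numpy as np
from numpy.polynomial import chebyshev as C

N = 10  # polynomial degree of the Tau approximation
# Chebyshev differentiation matrix: column n holds the coefficients of d/dx T_n(x).
D = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    e = np.zeros(N + 1); e[n] = 1.0
    d = C.chebder(e)
    D[:len(d), n] = d

# Tau system for y' - y = 0 on [-1, 1] with y(-1) = 1: keep the first N residual
# coefficients and replace the last equation by the boundary condition.
A = (D - np.eye(N + 1))[:N, :]
bc = np.array([C.chebval(-1.0, np.eye(N + 1)[n]) for n in range(N + 1)])  # T_n(-1) = (-1)^n
system = np.vstack([A, bc])
rhs = np.zeros(N + 1); rhs[-1] = 1.0
coeffs = np.linalg.solve(system, rhs)

x = np.linspace(-1, 1, 5)
print(np.max(np.abs(C.chebval(x, coeffs) - np.exp(x + 1))))  # small approximation error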
Abstract:
The aim of this work was to assess the influence of meteorological conditions on the dispersion of particulate matter from an industrial zone into urban and suburban areas. The particulate matter concentration was related to the most important meteorological variables, such as wind direction, velocity, and frequency. A coal-fired power plant with two 225 m high stacks was considered to be the main emission source. The midpoint between the two stacks was taken as the centre of two concentric circles with 6 and 20 km radii delimiting the sampling area. About 40 sampling collectors were placed within this area. Meteorological data were obtained from a portable meteorological station placed approximately 1.7 km SE of the stacks. Additional data were obtained from the electrical company that runs the coal power plant. These data cover the years from 2006 to the present. A detailed statistical analysis was performed to identify the most frequent meteorological conditions, mainly concerning wind speed and direction. This analysis revealed that the most frequent winds blow from the northwest and north, and the strongest winds blow from the northwest. Particulate matter deposition was measured in two sampling campaigns carried out in summer and in spring. For the first campaign the monthly average deposition flux was 1.90 g/m², and for the second campaign this value was 0.79 g/m². Wind dispersion occurred predominantly from north to south, away from the nearest residential area, located about 6 km northwest of the stacks. Nevertheless, the highest deposition fluxes occurred in the NW/N and NE/E quadrants. This study considered only the contribution of particulate matter from coal combustion; however, other sources, such as road traffic, may be present as well. Additional chemical analyses and microanalyses are needed to link the sources to the measured deposition flux levels.
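As a small illustration of the kind of statistical summary described above (the wind records are invented for the example, not measured data), wind directions can be binned into compass sectors to find the most frequent and the strongest sectors:

from collections import defaultdict

SECTORS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def sector(direction_deg):
    """Map a wind direction in degrees (coming-from convention) to an 8-point compass sector."""
    return SECTORS[int((direction_deg % 360 + 22.5) // 45) % 8]

# Invented (direction in degrees, speed in m/s) records standing in for the station data.
records = [(320, 7.2), (350, 5.1), (10, 4.0), (300, 9.5), (180, 2.3), (330, 8.8), (40, 3.1)]

counts, max_speed = defaultdict(int), defaultdict(float)
for d, s in records:
    counts[sector(d)] += 1
    max_speed[sector(d)] = max(max_speed[sector(d)], s)

print(max(counts, key=counts.get), max(max_speed, key=max_speed.get))  # most frequent / strongest sector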
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Informatics Engineering
Abstract:
The objective of large investments in telecommunication networks is to bring economies closer together and put an end to asymmetries. The most isolated regions could be the beneficiaries of this new wave of technological investment spreading through the territories. The new economic scenarios created by globalisation make high-capacity backbones and a coherent information society policy two instruments that could change the fate of regions and launch them into a context of economic development. Technology could bring international projection to services or products and could be the differentiating element between a national and an international economic strategy. Thus, networks and their flows are becoming two of the most important variables for economies. Measuring and representing this new informational accessibility, mapping new communities, and finding new patterns and localisation models could be today's challenge. In physical, real space, location is defined by two or three geographical coordinates. In the virtual space of networks, or cyberspace, geography seems incapable of defining location, because it lacks a good model. Trying to solve this problem, and building on geographical theories and concepts, new fields of study have come to light: Internet Geography, Cybergeography, and the Geography of Cyberspace are only three examples. In this paper, using Internet Geography and informational cartography, it was possible to observe and analyse the spatialisation of the Internet phenomenon through the distribution of IP addresses in the Portuguese territory. This work shows the great potential and applicability of this indicator to studies of Internet dissemination and regional development. The Portuguese territory is seen in a completely new form: the IP address distribution of Country Code Top Level Domains (.pt) can reveal new regional hierarchies. The spatial concentration or dispersion of top-level domains seems to be a good instrument to reflect the info-structural dynamics and economic development of a territory, especially at the regional level.