950 results for Computational grids (Computer systems)
Abstract:
This study examines Smart Grids and distributed generation connected to a single-family house. The distributed generation comprises a small wind power plant and solar panels. The study is carried out from the consumer's point of view and is divided into a theoretical part and a research part. The theoretical part defines distributed generation, wind power, solar energy and Smart Grids, examines what Smart Grids will enable, and reviews new technology related to Smart Grids. The research part introduces the wind and sun conditions of two countries, Finland and Germany. Based on the wind and sun conditions of these two countries, the annual electricity production of the wind power plant and the solar panels is calculated, and the costs of generating electricity from wind and solar energy are derived from these annual production figures. The study also deals with feed-in tariffs, which are support schemes for renewable energy resources. It examines whether it is cost-effective for consumers to use the produced electricity themselves or to sell it to the grid. Finally, figures are compiled for both countries, including the calculated cost of generating electricity from the wind power plant and the solar panels, retail and wholesale prices, and feed-in tariffs. In Finland, selling the produced electricity to the grid is not cost-effective until support schemes are in place. In Germany, selling the electricity produced by solar panels to the grid is cost-effective because of the feed-in tariffs. On the other hand, in Germany it is cost-effective to use the electricity produced from wind for one's own consumption, because the retail price is higher than the cost of the electricity produced from wind.
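As a rough illustration of the kind of consumer-level comparison described above, the sketch below estimates annual production and a simple per-kWh generation cost for a small wind turbine and a PV array, then compares them with retail, wholesale, and feed-in-tariff prices. All numbers and the simplified cost formula are assumptions for illustration, not figures from the study.

```python
# Minimal sketch: compare per-kWh generation cost of small wind/PV with
# retail price, wholesale price, and a feed-in tariff. All inputs are
# hypothetical placeholders, not values from the study.

def annual_energy_kwh(rated_kw, capacity_factor):
    """Annual energy from rated power and an assumed capacity factor."""
    return rated_kw * capacity_factor * 8760.0

def cost_per_kwh(capital_cost, annual_om, lifetime_years, discount_rate, annual_kwh):
    """Simple levelized cost: annualized capital (capital recovery factor) plus O&M."""
    crf = discount_rate * (1 + discount_rate) ** lifetime_years / \
          ((1 + discount_rate) ** lifetime_years - 1)
    return (capital_cost * crf + annual_om) / annual_kwh

wind_kwh = annual_energy_kwh(rated_kw=2.0, capacity_factor=0.18)   # assumed
pv_kwh = annual_energy_kwh(rated_kw=3.0, capacity_factor=0.11)     # assumed

wind_cost = cost_per_kwh(9000, 150, 20, 0.05, wind_kwh)
pv_cost = cost_per_kwh(6000, 50, 25, 0.05, pv_kwh)

retail, wholesale, feed_in_tariff = 0.15, 0.05, 0.20               # EUR/kWh, assumed
for name, cost in [("wind", wind_cost), ("solar", pv_cost)]:
    own_use_margin = retail - cost          # value of self-consumption
    export_margin = feed_in_tariff - cost   # value of selling to the grid
    print(f"{name}: cost {cost:.2f} EUR/kWh, own-use margin {own_use_margin:+.2f}, "
          f"export margin {export_margin:+.2f}")
```

A positive own-use margin means self-consumption beats buying from the grid; a positive export margin means selling under the tariff covers the generation cost.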
Abstract:
Major challenges must be tackled for brain-computer interfaces to mature into an established communications medium for VR applications. These challenges range from basic neuroscience studies to the development of optimal peripherals, mental gamepads, and more efficient brain-signal processing techniques.
Abstract:
A brain-computer interface (BCI) is a new communication channel between the human brain and a computer. Applications of BCI systems include the restoration of movement, communication and environmental control. In this study, experiments were conducted in which a BCI system was used to control or navigate in virtual environments (VE) by thought alone. To date, BCI experiments for navigation in VR have been carried out with both synchronous and asynchronous BCI systems. A synchronous BCI analyzes the EEG patterns in a predefined time window and offers 2 to 3 degrees of freedom.
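To make the synchronous, windowed analysis concrete, the sketch below classifies one cue-aligned EEG window into one of a small set of navigation commands. The band-power feature, channel names, and the simple comparison rule are assumptions for illustration; they do not reproduce the system used in the study.

```python
# Illustrative sketch of a synchronous BCI step: classify one cue-aligned EEG
# window into a small set of commands. The band-power feature and the simple
# comparison rule are assumptions for illustration only.
import numpy as np

FS = 250  # sampling rate in Hz (assumed)

def band_power(window, fs, f_lo, f_hi):
    """Mean spectral power of one channel in a frequency band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[mask].mean()

def classify_window(left_ch, right_ch):
    """Map mu-band (8-12 Hz) power over left/right motor cortex to a command."""
    mu_left = band_power(left_ch, FS, 8, 12)
    mu_right = band_power(right_ch, FS, 8, 12)
    # Event-related desynchronization: mu power drops contralateral to the imagined movement.
    return "turn_right" if mu_left < mu_right else "turn_left"

# One 2-second window per channel (synthetic data standing in for real EEG).
rng = np.random.default_rng(0)
c3, c4 = rng.standard_normal(2 * FS), rng.standard_normal(2 * FS)
print(classify_window(c3, c4))
```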
Abstract:
In this educational article we present a theoretical analysis, based on DFT/B3LYP/6-311++G(d,p) and ab initio MP2/6-311++G(d,p) calculations, of the reactivity and regioselectivity of the chlorination of anisole, toluene and nitrobenzene using trichloroisocyanuric acid (TICA) as the Cl+ donor. The HOMO/LUMO energies and NBO atomic charges of the various aromatic systems were calculated at the ab initio level, and the energies of the reagents and intermediates were calculated using DFT. These results are presented as a quantitative example of the electrophilic aromatic substitution (SEAr) mechanism for undergraduate organic chemistry courses.
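For reference, the reactivity descriptors commonly derived from the HOMO/LUMO energies in this kind of frontier-orbital analysis are the standard conceptual-DFT quantities below; these are textbook definitions, not necessarily the exact quantities tabulated in the article.

```latex
% Standard conceptual-DFT descriptors built from frontier-orbital energies
\mu \approx \frac{E_{\mathrm{HOMO}} + E_{\mathrm{LUMO}}}{2} \quad \text{(chemical potential)},
\qquad
\eta \approx \frac{E_{\mathrm{LUMO}} - E_{\mathrm{HOMO}}}{2} \quad \text{(hardness)},
\qquad
\omega = \frac{\mu^{2}}{2\eta} \quad \text{(electrophilicity)}
```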
Abstract:
Intermolecular forces are a useful concept that can explain the attraction between particles of matter as well as numerous phenomena in our lives, such as viscosity, solubility, drug interactions, and the dyeing of fibers. However, studies show that students have difficulty understanding this important concept, which led us to develop free educational software in English and Portuguese. The software can be used interactively by teachers and students, thus facilitating better understanding. Professors and students, both graduate and undergraduate, were surveyed on a Likert scale about the software's quality, intuitiveness of use, ease of navigation, and pedagogical applicability. The results led to the conclusion that the developed application can serve as an auxiliary tool to assist teachers in their lectures and students in learning about intermolecular forces.
Abstract:
Growing recognition of the need to modernize the electricity grid to enable new electricity generation and consumption schemes has found articulation in the vision of the Smart Grid platform. The essence of this vision is an autonomous network with two-way electricity flows and extensive real-time information exchange between generation nodes, various electricity-dependent appliances and all points in between. Three major components of Smart Grids are distributed intelligence, communication technologies, and automated control systems. The aim of this thesis is to identify the challenges that Smart Grids are facing, while distinguishing the main driving factors for their introduction. The scope of the thesis also covers the possible place of an electricity aggregator company in current and future electricity markets. Basic functions of an aggregator and possible revenue sources, along with demand response feasibility calculations, are reviewed in this thesis.
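The sketch below shows the flavor of a demand-response feasibility estimate an aggregator might make: valuing the shift of pooled flexible load from peak to off-peak hours and taking a share of the resulting saving. The prices, volumes, and revenue-sharing fraction are hypothetical placeholders, not figures from the thesis.

```python
# Back-of-the-envelope sketch of one aggregator revenue source: shifting
# flexible load from peak to off-peak hours. Prices and volumes are
# hypothetical placeholders, not figures from the thesis.

def demand_response_value(shiftable_kwh_per_day, peak_price, offpeak_price,
                          days_per_year=365, aggregator_share=0.3):
    """Annual value of load shifting and the aggregator's share of it."""
    daily_saving = shiftable_kwh_per_day * (peak_price - offpeak_price)
    annual_saving = daily_saving * days_per_year
    return annual_saving, annual_saving * aggregator_share

total, aggregator_cut = demand_response_value(
    shiftable_kwh_per_day=500.0,  # flexible load pooled from many households (assumed)
    peak_price=0.12,              # EUR/kWh spot price at peak (assumed)
    offpeak_price=0.04,           # EUR/kWh spot price off-peak (assumed)
)
print(f"Annual saving {total:.0f} EUR, aggregator revenue {aggregator_cut:.0f} EUR")
```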
Abstract:
Broadcasting systems are networks in which the transmission is received by several terminals. Generally, broadcast receivers are passive devices in the network, meaning that they do not interact with the transmitter. Providing a certain Quality of Service (QoS) for the receivers in a heterogeneous reception environment with no feedback is not an easy task. Forward error control coding can be used to protect against transmission errors and thus enhance the QoS of broadcast services. For good performance in terrestrial wireless networks, diversity should be utilized; this is done by applying interleaving together with forward error correction codes. This dissertation studies the design and analysis of forward error control and control signaling for providing QoS in wireless broadcasting systems. Control signaling is used in broadcasting networks to give the receiver the information it needs to connect to the network and to receive the services being transmitted. Control signaling is usually transmitted through a dedicated path in these systems, so the relationship between the signaling and service data paths should be considered early in the design phase. Modeling and simulations are used in the case studies of this dissertation to study this relationship. The dissertation begins with a survey of the broadcasting environment and the mechanisms for providing QoS in it. Case studies then present the analysis and design of such mechanisms in real systems. The first case study analyzes mechanisms for providing QoS at the DVB-H link layer, considering the signaling and service data paths and their relationship; in particular, the performance of different service data decoding mechanisms and the selection of optimal signaling transmission parameters are presented. The second case study investigates the design of the signaling and service data paths for the more modern DVB-T2 physical layer. By comparing the performance of the signaling and service data paths in simulations, configuration guidelines for DVB-T2 physical layer signaling are given; these guidelines can prove useful when configuring DVB-T2 transmission networks. Finally, recommendations for the design of data and signaling paths are given based on the findings of the case studies. The requirements for the signaling design should be derived from the requirements of the main services. In general, the requirements for signaling should be more demanding, as signaling is the enabler of service reception.
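The diversity gain from interleaving comes from spreading a burst of channel errors across many codewords so that the forward error correction sees only a few errors per codeword. The sketch below is a minimal block interleaver illustrating the principle; it is not the interleaver specified for DVB-H or DVB-T2.

```python
# Minimal block-interleaver sketch: write symbols row by row, read column by
# column, so a burst of channel errors is spread across many codewords that
# the forward error correction can then handle. Illustrative only, not the
# interleaver specified for DVB-H or DVB-T2.

def interleave(symbols, rows, cols):
    assert len(symbols) == rows * cols
    matrix = [symbols[r * cols:(r + 1) * cols] for r in range(rows)]  # write rows
    return [matrix[r][c] for c in range(cols) for r in range(rows)]   # read columns

def deinterleave(symbols, rows, cols):
    assert len(symbols) == rows * cols
    matrix = [symbols[c * rows:(c + 1) * rows] for c in range(cols)]  # write columns
    return [matrix[c][r] for r in range(rows) for c in range(cols)]   # read rows

data = list(range(12))
sent = interleave(data, rows=3, cols=4)
assert deinterleave(sent, rows=3, cols=4) == data
# A burst hitting 3 consecutive transmitted symbols lands in 3 different rows
# (codewords) after deinterleaving, easing the FEC decoder's job.
```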
Abstract:
The aim of this study was to simulate blood flow in the human thoracic aorta and to understand the role of flow dynamics in the initiation and localization of atherosclerotic plaque. Blood flow dynamics were numerically simulated in three idealized and two realistic models of the human thoracic aorta. The idealized models were reconstructed from measurements available in the literature, while the realistic models were constructed by processing Computed Tomography (CT) images made available by South Karelia Central Hospital in Lappeenranta. Reconstruction of the thoracic aorta consisted of operations such as contrast adjustment, image segmentation, and 3D surface rendering; additional design operations were performed to make the aorta models compatible with the numerical computer codes. The image processing and design operations were performed with specialized medical image processing software. Pulsatile pressure and velocity profiles were applied as inlet boundary conditions, and the blood was assumed to be a homogeneous, incompressible, Newtonian fluid. The simulations with the idealized models were carried out with a Finite Element Method based computer code, while the simulations with the realistic models were carried out with a Finite Volume Method based computer code. Simulations were run for four cardiac cycles, and the distributions of flow, pressure and Wall Shear Stress (WSS) observed during the fourth cycle were extensively analyzed. The aim of the simulations with the idealized models was to obtain an estimate of the flow dynamics in a realistic aorta model, and three models with distinct features were chosen to understand how the flow dynamics depend on aortic anatomy. A highly disturbed and non-uniform distribution of velocity and WSS was observed in the aortic arch, near the brachiocephalic, left common carotid, and left subclavian arteries. The WSS profiles at the roots of the branches showed significant differences as the geometry of the aorta and its branches varied; comparison of instantaneous WSS profiles revealed that the model with straight branching arteries had relatively lower WSS than the model with curved branches. Significant differences were also observed in the spatial and temporal profiles of WSS, flow, and pressure. The study with the idealized models was extended to blood flow in the thoracic aorta under hypertension and hypotension: one of the idealized models, together with its boundary conditions, was modified to mimic these conditions.
The simulations with the realistic models extracted from CT scans demonstrated more realistic flow dynamics than the idealized models. During systole, the velocity in the ascending aorta was skewed towards the outer wall of the aortic arch, and the flow developed secondary flow patterns as it moved downstream towards the arch. Unlike in the idealized models, the distribution of flow was non-planar and strongly guided by the arterial anatomy. Flow cavitation was observed in the aorta model imaged with longer branches; it could not be properly observed in the model whose imaging contained shorter aortic branches. Flow circulation was also observed at the inner wall of the aortic arch. During diastole, however, the flow profiles were almost flat and regular due to the acceleration of the flow at the inlet, and they were weakly turbulent during flow reversal. The complex flow patterns caused a non-uniform distribution of WSS: high WSS occurred at the junctions of the branches and the aortic arch, low WSS at the proximal part of the junctions, and intermediate WSS at the distal part. The pulsatile nature of the inflow caused oscillating WSS at the branch entry regions and the inner curvature of the aortic arch. Based on the WSS distribution in the realistic model, one of the aorta models was altered to introduce artificial atherosclerotic plaque at the branch entry regions and the inner curvature of the aortic arch. Plaque causing 50% blockage of the lumen was introduced in the brachiocephalic artery, the common carotid artery, the left subclavian artery, and the aortic arch. The aims of this part of the study were to examine the effect of stenosis on the flow and WSS distributions, to understand the effect of the shape of the atherosclerotic plaque, and to investigate the effect of the severity of lumen blockage. The results revealed that the WSS distribution is significantly affected by plaque causing a mere 50% stenosis, and that an asymmetric stenosis causes higher WSS in the branching arteries than a symmetric plaque. The flow dynamics within the thoracic aorta models have thus been extensively studied and reported here, and the effects of pressure and arterial anatomy on the flow dynamics were investigated. The distribution of complex flow and WSS correlates with the localization of atherosclerosis. From the available results we conclude that the thoracic aorta, with its complex anatomy, is the artery most vulnerable to the localization and development of atherosclerosis, and that flow dynamics and arterial anatomy play a role in this localization. Patient-specific, image-based models can be used to identify the locations in the aorta vulnerable to the development of arterial diseases such as atherosclerosis.
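For reference, the wall shear stress for a Newtonian fluid and the oscillatory shear index often used to quantify oscillating WSS are defined as follows; these are standard definitions, not expressions reproduced from the thesis.

```latex
% Wall shear stress for a Newtonian fluid and the oscillatory shear index (OSI)
\tau_w = \mu \left. \frac{\partial u}{\partial n} \right|_{\text{wall}},
\qquad
\mathrm{OSI} = \frac{1}{2}\left( 1 -
  \frac{\left| \int_0^T \boldsymbol{\tau}_w \, dt \right|}
       {\int_0^T \left| \boldsymbol{\tau}_w \right| dt} \right)
```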
Abstract:
Linear programming models are effective tools to support the initial or periodic planning of agricultural enterprises; they require, however, technical coefficients that can be determined using computer simulation models. This paper, presented in two parts, deals with the development, application and testing of a methodology and a computational modeling tool to support the planning of irrigated agriculture. Part I covers the development and application, including sensitivity analysis, of a multiyear linear programming model to optimize financial return and water use at the farm level for the Jaíba irrigation scheme, Minas Gerais State, Brazil, using data on crop irrigation requirements and yields obtained from previous simulations with the MCID model. The linear programming model produced a cropping pattern with a maximum total net present value of R$ 372,723.00 for the four-year period. Constraints on monthly water availability, labor, land and production were critical in the optimal solution. Regarding water use optimization, it was verified that substantial reductions in irrigation requirements can be achieved with only small reductions in the maximum total net present value.
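The sketch below illustrates the type of farm-level linear program described above: choosing crop areas to maximize net present value subject to land and monthly water limits. The crops, coefficients, and limits are hypothetical placeholders, not the Jaíba data or the MCID coefficients.

```python
# Minimal sketch of a farm-level LP: choose crop areas to maximize total net
# present value subject to land and monthly water limits. Crops, coefficients,
# and limits are hypothetical, not the study's data.
from scipy.optimize import linprog

crops = ["passion_fruit", "banana", "maize"]
npv_per_ha = [9000.0, 7000.0, 2500.0]         # R$/ha over the horizon (assumed)
water_m3_per_ha = [                            # monthly irrigation need (assumed)
    [1200, 1100, 900],                         # April
    [1000, 1000, 800],                         # May
]
water_limit_m3 = [150_000, 140_000]            # monthly water availability (assumed)
land_limit_ha = 120.0

# linprog minimizes, so negate the NPV coefficients to maximize.
c = [-v for v in npv_per_ha]
A_ub = water_m3_per_ha + [[1.0, 1.0, 1.0]]     # water rows + total land row
b_ub = water_limit_m3 + [land_limit_ha]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * len(crops))
print({crop: round(area, 1) for crop, area in zip(crops, res.x)},
      "NPV:", round(-res.fun, 2))
```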
Abstract:
Techniques for evaluating the risks arising from the uncertainties inherent in agricultural activity should accompany planning studies. Risk analysis should be carried out by risk simulation using techniques such as the Monte Carlo method. This study was carried out to develop a computer program, called P-RISCO, for applying risk simulation to linear programming models, to apply it to a case study, and to compare its results with those of the @RISK program. In the risk analysis it was observed that the mean of the output variable, the total net present value U, was considerably lower than the maximum U value obtained from the linear programming model. It was also verified that the enterprise faces a significant risk of water shortage in April, which does not occur for the cropping pattern obtained by minimizing the irrigation requirement in the Aprils of the four years. The scenario analysis indicated that the sale price of the passion fruit crop exerts a strong influence on the financial performance of the enterprise. The comparative analysis verified the equivalence of the P-RISCO and @RISK programs in executing the risk simulation for the considered scenario.
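The sketch below shows the spirit of such a Monte Carlo risk simulation: an uncertain input (here the passion fruit sale price) is sampled, propagated through a simplified net-present-value calculation for a fixed cropping pattern, and the resulting distribution is summarized. The distribution and coefficients are assumptions, not the study's inputs.

```python
# Illustrative Monte Carlo risk sketch: treat the passion fruit sale price as
# uncertain, propagate it through a simplified NPV calculation, and examine
# the resulting distribution. All inputs are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_draws = 10_000

area_ha = 120.0                    # cropping pattern fixed by the LP solution (assumed)
yield_t_per_ha = 20.0              # assumed
cost_per_ha = 6000.0               # assumed, R$
price_per_t = rng.triangular(left=400.0, mode=550.0, right=800.0, size=n_draws)  # R$/t

npv = area_ha * (yield_t_per_ha * price_per_t - cost_per_ha)

print(f"mean NPV: {npv.mean():,.0f} R$")
print(f"5th percentile NPV: {np.percentile(npv, 5):,.0f} R$")
print(f"P(NPV < 0): {(npv < 0).mean():.1%}")
```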
Abstract:
Simulation programs are important tools for analyzing different energy alternatives, including the use of renewable energy. The objective of this study was to compare the different computer tools available for modeling solar water heaters. The main simulation software packages for solar thermal systems include RETScreen International, EnergyPlus, TRNSYS, SolDesigner, SolarPro, and T*SOL. Among these, only EnergyPlus and RETScreen International are free, but they yield interesting results when applied together: the first has a detailed module for the energy analysis of solar water heaters, while the second provides a detailed economic feasibility study and an assessment of greenhouse gas emissions. The RETScreen International and EnergyPlus programs are aimed at a diverse audience, including designers, researchers and energy planners.
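The central quantity such tools compute for a flat-plate solar water heater is the useful energy gain of the collector, commonly written with the Hottel-Whillier-Bliss relation below (a standard textbook equation, not one taken from the cited programs): here $A_c$ is the collector area, $F_R$ the heat removal factor, $G_T$ the incident irradiance, $(\tau\alpha)$ the transmittance-absorptance product, $U_L$ the overall loss coefficient, and $T_i$, $T_a$ the inlet and ambient temperatures.

```latex
% Useful energy gain of a flat-plate solar collector (Hottel-Whillier-Bliss)
Q_u = A_c F_R \left[ G_T (\tau\alpha) - U_L \left( T_i - T_a \right) \right]
```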
Abstract:
Energy efficiency is one of the major objectives that must be achieved in order to use the world's limited energy resources in a sustainable way. Since radiative heat transfer is the dominant heat transfer mechanism in most fossil fuel combustion systems, more accurate insight and models can improve the energy efficiency of newly designed combustion systems. The radiative properties of combustion gases are highly wavelength dependent, and better models for calculating them are needed for the modeling of large-scale industrial combustion systems. With detailed knowledge of the spectral radiative properties of gases, the modeling of combustion processes in different applications can be more accurate. In order to propose a new method for effective non-gray modeling of radiative heat transfer in combustion systems, different models for the spectral properties of gases, including the SNBM, EWBM, and WSGGM, were studied in this research. Building on this detailed analysis of the different approaches, the thesis presents new methods for gray and non-gray radiative heat transfer modeling in homogeneous and inhomogeneous H2O–CO2 mixtures at atmospheric pressure. The proposed method can support the modeling of a wide range of combustion systems, including the oxy-fired combustion scenario. The new methods are based on implementing pre-obtained correlations for the total emissivity and band absorption coefficient of H2O–CO2 mixtures at different temperatures, gas compositions, and optical path lengths. They can easily be used within any commercial CFD software for radiative heat transfer modeling, resulting in more accurate, simple, and fast calculations. The new methods were successfully used in CFD modeling by applying them to an industrial-scale backpass channel under oxy-fired conditions. The developed approaches are more accurate than other methods; moreover, they provide a complete explanation and detailed analysis of the radiative heat transfer in different systems under different combustion conditions. The methods were verified by applying them to several benchmarks, where they showed a good level of accuracy and computational speed compared to other methods. Furthermore, the implementation of the suggested banded approach in CFD software is straightforward.
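Correlations of the weighted-sum-of-gray-gases type mentioned above express the total emissivity of the gas mixture in the generic form below, where the transparent-window term corresponds to $\kappa_0 = 0$; this is the standard textbook form, shown without the thesis' fitted coefficients.

```latex
% Weighted-sum-of-gray-gases total emissivity of an H2O-CO2 mixture
\varepsilon(T, pL) = \sum_{i=0}^{N} a_{\varepsilon,i}(T)\,
  \left[ 1 - e^{-\kappa_i \, p \, L} \right],
\qquad
a_{\varepsilon,i}(T) = \sum_{j=0}^{J} b_{i,j}\, T^{\,j}
```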
Abstract:
Developing software is a difficult and error-prone activity, and the complexity of modern computer applications is significant. Hence, an organised approach to software construction is crucial. Stepwise Feature Introduction – created by R.-J. Back – is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can easily be reused. Moreover, interaction with the users of the software and attention to correctness are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been successfully applied in an academic environment to a number of small-scale developments. This thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion covers various aspects of software development that relate to Stepwise Feature Introduction; more specifically, we evaluate the paradigm against common practices of object-oriented programming and design and against agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
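A toy sketch of the layered structure that Stepwise Feature Introduction produces is given below: each layer adds one feature while preserving the behaviour established by the layers beneath it. The text-editor example and class names are invented for illustration and are not taken from the thesis.

```python
# Toy illustration of the layered structure produced by Stepwise Feature
# Introduction: each layer adds a feature in a small increment while
# preserving the behaviour already established below it.

class BasicEditor:
    """Layer 1: the minimal feature - holding and appending text."""
    def __init__(self):
        self.text = ""
    def insert(self, s: str) -> None:
        self.text += s

class UndoEditor(BasicEditor):
    """Layer 2: adds undo without breaking layer 1's behaviour."""
    def __init__(self):
        super().__init__()
        self._history = []
    def insert(self, s: str) -> None:
        self._history.append(self.text)   # record state, then delegate down
        super().insert(s)
    def undo(self) -> None:
        if self._history:
            self.text = self._history.pop()

class SearchEditor(UndoEditor):
    """Layer 3: adds search, reusing everything below unchanged."""
    def find(self, needle: str) -> int:
        return self.text.find(needle)

e = SearchEditor()
e.insert("stepwise ")
e.insert("feature introduction")
e.undo()
print(e.text, e.find("wise"))   # -> "stepwise " 4
```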
Abstract:
Today's networked systems are becoming increasingly complex and diverse. Current simulation and runtime verification techniques do not support developing such systems efficiently; moreover, the reliability of the simulated or verified systems is not thoroughly ensured. To address these challenges, the use of formal techniques to reason about network system development is growing, yet the mathematical background necessary for using formal techniques remains a barrier that keeps network designers from employing them efficiently. Thus, these techniques are not widely used for developing networked systems. The objective of this thesis is to propose formal approaches for the development of reliable networked systems while taking efficiency into account. With respect to reliability, we propose the architectural development of correct-by-construction networked system models. With respect to efficiency, we propose reusable network architectures as well as reusable network development. At the core of our development methodology, we employ abstraction and refinement techniques for the development and analysis of networked systems. We evaluate our proposal by applying the proposed architectures to a pervasive class of dynamic networks, namely wireless sensor network architectures, as well as to a pervasive class of static networks, namely network-on-chip architectures. The ultimate goal of our research is to put forward the idea of building libraries of pre-proved rules for the efficient modelling, development, and analysis of networked systems. We take into account both qualitative and quantitative analysis of networks via varied formal tool support, using a theorem prover, the Rodin platform, and a statistical model checker, SMC-Uppaal.