22 results for "Integrated circuits -- Very large scale integration -- Design and construction".

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance:

100.00%

Publisher:

Abstract:

Currently, a high penetration of distributed generation (DG) is observed in Danish distribution systems, and even more DG units are foreseen in the coming years. How to utilize them to maintain security of supply in emergency situations has been of great interest for study. This master's project develops a control architecture for studying distribution systems with large-scale integration of solar power. As part of the EcoGrid EU Smart Grid project, it focuses on the modelling and simulation of a representative Danish LV network located on the island of Bornholm. Within the control architecture, two types of reactive power control techniques are implemented and compared. In addition, network voltage control based on a tap-changing transformer is tested. After applying a genetic algorithm to five typical Danish domestic loads, the optimized results show lower power losses and voltage deviation with Q(U) control, especially at high consumption. Finally, a communication and information exchange system is developed with the objective of regulating the reactive power, and thereby the network voltage, remotely and in real time. Validation tests of the simulated parameters are performed as well.
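
The abstract compares reactive power control techniques; the Q(U) (volt-var) droop it mentions can be sketched as a piecewise-linear curve. All numeric thresholds below are illustrative assumptions, not values from the thesis.

```python
def q_u_droop(v_pu, q_max=0.3, v_db_low=0.98, v_db_high=1.02,
              v_min=0.94, v_max=1.06):
    """Piecewise-linear Q(U) droop: inject reactive power when the local
    voltage is low, absorb it when the voltage is high. Voltages are in
    per-unit, Q as a fraction of the inverter rating; all threshold
    values here are illustrative, not the thesis's."""
    if v_pu <= v_min:
        return q_max                      # full injection (voltage support)
    if v_pu >= v_max:
        return -q_max                     # full absorption
    if v_pu < v_db_low:                   # ramp below the dead band
        return q_max * (v_db_low - v_pu) / (v_db_low - v_min)
    if v_pu > v_db_high:                  # ramp above the dead band
        return -q_max * (v_pu - v_db_high) / (v_max - v_db_high)
    return 0.0                            # inside the dead band

print(q_u_droop(1.00))   # 0.0  (dead band)
print(q_u_droop(1.06))   # -0.3 (full absorption at overvoltage)
```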

Relevance:

100.00%

Publisher:

Abstract:

This thesis contains dynamical analyses on four different scales: the Solar system, the Sun itself, the Solar neighbourhood, and the central region of the Milky Way galaxy. All of these topics are handled through methods of potential theory and statistics. The central topic of the thesis is the orbits of stars in the Milky Way. An introduction to the general structure of the Milky Way is presented, with an emphasis on the evolution of the observed value of the scale length of the Milky Way disc and the observations of two separate bars in the Milky Way. The basics of potential theory are also presented, along with a potential model developed for the Milky Way. An implementation of the backwards restricted integration method is shown, rounding off the basic principles used in the dynamical studies of this thesis. The thesis examines the orbit of the Sun and its impact on the Oort cloud comets (Paper IV), showing a clear link between these two dynamical systems. The supposed statistical atypicality of the Sun's orbit is questioned (Paper I), concluding that the orbit shows some statistical typicality, although not at a significant level. This depends slightly on whether a bar is included, as a bar has a clear effect on the dynamical features seen in the Solar neighbourhood (Paper III), and this method can be used to constrain the possible properties of a bar. Finally, the effect of a bar on a statistical system in the Milky Way is studied, showing not only interesting effects that depend on the mass and size of the bar, but also how bars can capture disc stars (Paper II).
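
Orbit studies like those behind Papers I-IV rest on numerical orbit integration. A minimal sketch using a kick-drift-kick leapfrog in a toy point-mass potential; the thesis's actual Milky Way potential model and backwards restricted integration method are far richer.

```python
import math

def leapfrog_orbit(x, y, vx, vy, gm=1.0, dt=1e-3, steps=10000):
    """Kick-drift-kick leapfrog integration of a 2-D orbit in a
    point-mass potential Phi = -GM/r (a toy stand-in for a full
    galactic potential). Returns the final phase-space state."""
    def accel(x, y):
        r3 = (x * x + y * y) ** 1.5
        return -gm * x / r3, -gm * y / r3

    ax, ay = accel(x, y)
    for _ in range(steps):
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
        x += dt * vx; y += dt * vy                 # drift
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
    return x, y, vx, vy

# Circular orbit at r = 1 with v = 1: the energy E = v^2/2 - GM/r
# should stay very close to -0.5, since leapfrog is symplectic.
x, y, vx, vy = leapfrog_orbit(1.0, 0.0, 0.0, 1.0)
energy = 0.5 * (vx**2 + vy**2) - 1.0 / math.hypot(x, y)
print(round(energy, 6))  # close to -0.5
```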

Relevance:

100.00%

Publisher:

Abstract:

Planar, large-area, position-sensitive silicon detectors are widely utilized in high energy physics research and in medical computed tomography (CT). This thesis describes the author's research work on the development of such detector components. The key motivation and objective of the work has been the development of novel position-sensitive detectors that improve the performance of the instruments they are intended for. Silicon strip detectors are the key components of barrel-shaped tracking instruments, which are typically the innermost structures of high energy physics experimental stations: particle colliders such as the former LEP collider or the present LHC produce particle collisions, and silicon-strip-detector-based trackers locate the trajectories of the particles emanating from these collisions. Medical CT has become a regular part of everyday medical care in all developed countries. CT scanning enables x-ray imaging of all parts of the human body with outstanding structural resolution and contrast: brain, chest, and abdomen slice images with a resolution of 0.5 mm are possible, and the latest CT machines are able to image the whole human heart between heartbeats. The two application areas are presented briefly, and the radiation detection properties of planar silicon detectors are discussed, together with the fabrication methods and preamplifier electronics of the planar detectors. The designs of the developed large-area silicon detectors are presented, and measurement results for the key operating parameters are discussed. The static and dynamic performance of the developed silicon strip detectors is shown to be very satisfactory for experimental physics applications, and the results for the developed novel CT detector chips are found to be very promising for further development, with all key performance goals met.
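
CT imaging with such detectors ultimately measures x-ray attenuation along each ray. A minimal Beer-Lambert sketch, with an assumed, illustrative attenuation coefficient (not a value from the thesis):

```python
import math

def transmitted_fraction(mu_per_cm, thickness_cm):
    """Beer-Lambert attenuation I/I0 = exp(-mu * x), the physical basis
    of CT imaging: each detector element measures how strongly the x-ray
    beam was attenuated along its path through the body."""
    return math.exp(-mu_per_cm * thickness_cm)

# ~0.2 /cm is a rough, assumed attenuation coefficient for soft tissue
# at CT energies, used here purely for illustration.
frac = transmitted_fraction(0.2, 10.0)
print(round(frac, 4))  # 0.1353 -> ~13.5% of photons traverse 10 cm
```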

Relevance:

100.00%

Publisher:

Abstract:

Global interest in renewable energy production, such as wind and solar energy, is increasing, which in turn calls for new energy storage concepts owing to the larger share of intermittent production. Power-to-gas solutions can be utilized to convert surplus electricity into chemical energy that can be stored for extended periods of time. The energy storage concept explored in this thesis is an integrated storage tank connected to an oxy-fuel combustion plant. Using this approach, flue gases from the plant could be fed directly into the storage tank and later converted into synthetic natural gas via an electrolysis and methanation route. This work utilizes computational fluid dynamics to model the desublimation of carbon dioxide inside a storage tank containing a cryogenic liquid, such as liquefied natural gas. Numerical modelling enables the evaluation of the transient flow patterns caused by the desublimation, as well as the general fluid behaviour inside the tank. Based on the simulations, the stability of the cryogenic storage and the magnitudes of the key parameters can be evaluated.
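
The thesis's CFD model is far beyond a short sketch, but the flavour of the transient numerical modelling it relies on can be illustrated with a 1-D explicit finite-difference heat-conduction step (all parameters illustrative, not from the thesis):

```python
def diffuse_1d(temps, alpha, dx, dt, steps):
    """Explicit finite-difference scheme for the 1-D heat equation
    dT/dt = alpha * d2T/dx2, with fixed-temperature boundaries.
    Stability requires r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for r > 0.5"
    t = list(temps)
    for _ in range(steps):
        t = [t[0]] + [
            t[i] + r * (t[i - 1] - 2 * t[i] + t[i + 1])
            for i in range(1, len(t) - 1)
        ] + [t[-1]]
    return t

# Cryogenic boundaries at 77 K with a warm interior spike at 300 K:
# the spike decays toward the boundary temperature over time.
profile = diffuse_1d([77.0] * 10 + [300.0] + [77.0] * 10,
                     alpha=1e-5, dx=0.01, dt=1.0, steps=500)
print(max(profile))  # below 300.0: the spike has partly diffused away
```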

Relevance:

100.00%

Publisher:

Abstract:

This work focuses on the 159.5 kW solar photovoltaic power plant installed at Lappeenranta University of Technology (LUT) in 2013 as an example of what a solar plant project can be in Finland. The plant consists of a two-row carport and a flat-roof installation on the roof of the university laboratories. Its purpose is not only the obvious energy-saving potential but also to serve as a research and teaching laboratory tool. By 2013, there were few large-scale solar power plants in Finland, so the installation and operating experience from the LUT plant has provided valuable information for similar projects in northern countries. The first part of this work covers the design and procurement of the project, followed by a description of the components and their installation. Finally, the energy produced by the plant is studied and calculated to derive relevant economic results, taking into account the radiation arriving in southern Finland, the losses of the system in cold weather, and the impact of snow, among other aspects.
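
A back-of-envelope version of the yield calculation described above. The plant size is from the text, while the insolation and performance-ratio figures are assumed, illustrative values for southern Finland.

```python
def annual_pv_energy(p_peak_kw, insolation_kwh_m2, performance_ratio):
    """Rough annual PV yield: peak power (kWp) times annual in-plane
    insolation (kWh/m^2, numerically equal to 'peak sun hours') times
    a performance ratio covering system losses (inverter, temperature,
    snow, soiling)."""
    return p_peak_kw * insolation_kwh_m2 * performance_ratio

# 159.5 kWp plant (from the text); ~1000 kWh/m2 in-plane insolation and
# PR = 0.8 are illustrative assumptions, not measured values.
energy = annual_pv_energy(159.5, 1000.0, 0.8)
print(round(energy))  # 127600 kWh/year
```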

Relevance:

100.00%

Publisher:

Abstract:

The SD card (Secure Digital Memory Card) is a widely used portable storage medium. Recent research on SD cards has mainly concerned SD card controllers based on FPGAs (Field Programmable Gate Arrays), most of which rely on an API (Application Programming Interface), the AHB bus (Advanced High-performance Bus), or similar interfaces, and are dedicated to realizing ultra-high-speed communication between the SD card and the host system. Such controllers play a vital role in high-speed cameras and other specialized fields. The FPGA-based file system and SD 2.0 IP (Intellectual Property core) presented here not only achieve a good transmission rate but also provide systematic file management, while retaining strong portability and practicality. The design and implementation of the file system on an SD card covers three main IP innovations. First, the combination and integration of the file system and the SD card controller make the overall system highly integrated and practical: the popular SD 2.0 protocol is implemented for the communication channel, and a pure digital logic design written in VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) integrates the SD card controller in the hardware layer with the FAT32 file system for the entire system. Second, the file management mechanism makes file processing more convenient, especially for batch processing of small files: it relieves the host system of the pressure of frequently accessing and processing them, thereby enhancing overall system efficiency. Finally, the digital design ensures superior performance. For transmission security, a CRC (Cyclic Redundancy Check) algorithm protects the data transmission, each module is designed to be independent of platform-specific macro cells for better portability, and custom integrated instructions and interfaces make the system easy to use.
The design was tested on multiple platforms, using both Xilinx and Altera FPGA development boards, and the timing simulation and debugging of each module were covered. Test results show that the designed FPGA-based file system IP supports SD, TF, and Micro SD cards with the 2.0 protocol as well as the SD bus mode, and successfully implements systematic management of stored files. Measured read and write rates on a Kingston Class 10 card are approximately 24.27 MB/s and 16.94 MB/s, respectively.
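
The abstract notes that a CRC algorithm protects the data transmission. For SD command frames the relevant checksum is CRC-7 with polynomial x^7 + x^3 + 1; the thesis implements the equivalent in VHDL, but the behaviour can be sketched in Python:

```python
def crc7(data: bytes) -> int:
    """Bitwise CRC-7 (polynomial x^7 + x^3 + 1) as used for SD-card
    command frames. Each message bit is fed MSB-first into a 7-bit
    linear feedback shift register."""
    crc = 0
    for byte in data:
        d = byte
        for _ in range(8):
            crc = (crc << 1) & 0xFF
            if (d ^ crc) & 0x80:   # next message bit differs from MSB
                crc ^= 0x09
            d = (d << 1) & 0xFF
    return crc & 0x7F

# CMD0 (GO_IDLE_STATE) with a zero argument: the well-known final frame
# byte is (CRC7 << 1) | 1 = 0x95.
cmd0 = bytes([0x40, 0x00, 0x00, 0x00, 0x00])
print(hex((crc7(cmd0) << 1) | 1))  # 0x95
```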

Relevance:

100.00%

Publisher:

Abstract:

In the theoretical part, different polymerisation catalysts are introduced and the phenomena related to mixing in a stirred tank reactor are presented, together with the advantages and challenges of scale-up. The aim of the applied part was to design and implement an intermediate-sized reactor suitable for scale-up studies. The reactor setup was tested by preparing one batch of Ziegler–Natta polypropylene catalyst. The catalyst preparation with the designed equipment succeeded, and the catalyst was analysed so that its properties could be compared with those of a typical Ziegler–Natta polypropylene catalyst. The total titanium content of the catalyst was slightly higher than normal, but the magnesium and aluminium contents were at normal levels; by adjusting the siphon tube and adding one washing step, the titanium content could be decreased. The particle size of the catalyst was small, but its activity was in the normal range, and the particle size could be increased by decreasing the stirring speed. During the test run it was noticed that some improvements to the designed equipment setup could be made. For example, more valves need to be added to the chemical feed line to ensure inert conditions during catalyst preparation, and the nitrogen supply to the reactor needs to be separated from the other nitrogen lines so that the reactor pressure can be kept as desired during catalyst preparation. The proposals for improvements are presented in the applied part; once these improvements are made, the equipment setup is ready for start-up. A computational fluid dynamics model of the designed reactor was produced in cooperation with Lappeenranta University of Technology. The experiments showed that for adequate mixing with one impeller, a stirring speed of 600 rpm is needed, and the model with two impellers showed no difference in mixing efficiency whether the upper impeller pumped downwards or upwards.
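
Mixing and scale-up analyses like the one above typically characterize the flow regime with the impeller Reynolds number. A sketch using the 600 rpm speed from the text; the impeller diameter and fluid properties are assumed, illustrative values.

```python
def impeller_reynolds(n_rpm, d_m, rho, mu):
    """Impeller Reynolds number Re = rho * N * D^2 / mu, with the
    rotational speed N converted to revolutions per second."""
    n_rps = n_rpm / 60.0
    return rho * n_rps * d_m ** 2 / mu

# 600 rpm (the speed found adequate in the thesis) with an assumed
# 0.1 m impeller in a water-like medium (rho = 1000 kg/m3, mu = 1 mPa.s).
re = impeller_reynolds(600, 0.1, 1000.0, 1e-3)
print(round(re))  # 100000 -> fully turbulent regime (Re >> 1e4)
```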

Relevance:

100.00%

Publisher:

Abstract:

Ore sorting after crushing is an effective way to enhance the feed quality of a concentrator. Sorting by hand is the oldest way of concentrating minerals, but it has become outdated because of its low capacity, and older sorting methods in general have been difficult to use in large-scale production for the same reason: data transfer and processing, and the speed of the rejection mechanisms, have been the bottlenecks for the effective use of sorters. A fictive chalcopyrite ore body was created for this thesis, with properties typical of chalcopyrite ores, and an economic limit was set for the design. The concentrator capacity was determined by the size of the ore body and the planned mine life. Two concentrator scenarios were compared, one with a sorting facility and one without, in terms of the quality and amount of feed, the size of the equipment, and the economics. The concentrator with sorting had lower investment and operating costs but also lower income due to the ore lost in sorting. Net cash flow, net present value, and internal rate of return were calculated to compare the two scenarios.
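
The economic comparison rests on net present value and internal rate of return. A minimal sketch with purely illustrative cash flows, not figures from the thesis:

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, first element at t = 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return: the discount rate where NPV = 0,
    found by bisection (assumes NPV decreases with the rate, which
    holds for an investment followed by positive cash flows)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Illustrative only: 100 (monetary units) invested up front, then a net
# cash flow of 30 per year for 5 years.
flows = [-100.0] + [30.0] * 5
print(round(npv(0.08, flows), 2))   # 19.78
print(round(irr(flows), 4))         # 0.1524
```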

Relevance:

100.00%

Publisher:

Abstract:

Frontier and emerging economies have implemented policies aimed at liberalizing their equity markets. Equity market liberalization opens the domestic equity market to foreign investors and likewise paves the way for domestic investors to invest in foreign equity securities. Among other things, liberalization yields diversification benefits and leads to a lower cost of equity capital, as investors require a lower rate of return; in addition, foreign and local investors share any potential risks. Liberalized equity markets also become more liquid, since there are more investors to trade. Equity market liberalization results in financial integration, which explains the co-movement of markets; in crisis periods, increased volatility and co-movement between two markets may result in what is termed contagion effects. In Africa, major moves toward financial liberalization generally started in the late 1980s, with South Africa as the pioneer. Over the years, researchers have studied the impact of financial liberalization on Africa's economic development with diverse results: some positive, others negative, and still others mixed. The objective of this study is to establish whether African stock markets are integrated with the United States (US) and world markets, and to examine whether there are international linkages between the African, US, and world markets. A bivariate VAR-GARCH-BEKK model is employed. The effect of thin trading is removed through a series of econometric data-purification steps, because thin trading (also known as non-trading or inconsistent trading) is a common feature of African markets and may produce inconsistent and biased results. The study confirms the widely established result that the South African and Egyptian stock markets are highly integrated with the US and world markets.
Interestingly, the study adds to knowledge in this research area by establishing that Kenya is also highly integrated with the US and world markets, and that it both receives and transmits past innovations and shocks to and from the US and world markets.
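
The abstract does not spell out its data-purification steps, but one standard thin-trading correction (in the spirit of Miller, Muthuswamy and Whaley) fits an AR(1) model to observed returns and rescales the residuals. A sketch under that assumption:

```python
def thin_trading_adjust(returns):
    """Thin-trading correction: fit r_t = a0 + a1*r_{t-1} + e_t by OLS
    and rescale the residuals, r_adj_t = e_t / (1 - a1), to undo the
    spurious autocorrelation that infrequent trading induces."""
    x = returns[:-1]
    y = returns[1:]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    a1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
         sum((xi - mx) ** 2 for xi in x)
    a0 = my - a1 * mx
    return [(yi - a0 - a1 * xi) / (1.0 - a1) for xi, yi in zip(x, y)]

# Illustrative return series (not real market data).
raw = [0.01, 0.012, 0.008, -0.004, -0.006, 0.002, 0.009, 0.011,
       -0.003, 0.001]
adj = thin_trading_adjust(raw)
print(len(adj))  # 9 (one observation lost to the lag)
```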

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

100.00%

Publisher:

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all the information relevant to their research. In response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged, striving to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE), and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature, extract and aggregate the information found within, and automatically normalize the variability of natural language statements. Among these tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in the biomedical literature and their representation as a set of recursive event structures. The 2009-2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since the top-ranking event extraction systems are based on machine learning and are trained on narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events that are spotted by end users.
This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing their general credibility. The second part of the thesis is dedicated to a system developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. The hypothesis generation problem is cast as supervised network topology prediction, i.e. predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).
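
The hypothesis-generation part casts edge prediction as supervised learning over network-topology features. A minimal sketch of one classic such feature, the common-neighbour count; this is a simple stand-in for the richer feature set actually used in the thesis, and the toy graph is invented for illustration.

```python
from itertools import combinations

def common_neighbor_scores(edges):
    """Score each unlinked node pair by its number of common neighbours,
    one of the simplest topology-based features for predicting new
    edges in a network."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    scores = {}
    for u, v in combinations(sorted(adj), 2):
        if v not in adj[u]:                      # only unlinked pairs
            scores[(u, v)] = len(adj[u] & adj[v])
    return scores

# Toy gene/gene-product network: A and C share three neighbours, so
# (A, C) is the top candidate for a new interaction.
edges = [("A", "B"), ("B", "C"), ("A", "D"), ("D", "C"),
         ("C", "E"), ("A", "E")]
scores = common_neighbor_scores(edges)
best = max(scores, key=scores.get)
print(best, scores[best])  # ('A', 'C') 3
```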

Relevance:

100.00%

Publisher:

Abstract:

The demand for high performance, high precision, and energy savings in rotating machinery calls for an alternative to traditional bearings. Because of their contactless operating principle, rotating machines employing active magnetic bearings (AMBs) provide many advantages over traditional ones: contamination-free operation, low maintenance costs, high rotational speeds, low parasitic losses, programmable stiffness and damping, and vibration isolation. These advantages come at the expense of high cost and a complex technical solution, which makes AMBs appropriate primarily for specific and highly demanding applications. High-performance, high-precision control requires model-based control methods and accurate models of the flexible rotor; in turn, complex models lead to high-order controllers with a considerable computational burden. Fortunately, recent advancements in signal processing devices provide a new perspective on the real-time control of AMBs. The design and real-time digital implementation of high-order LQ controllers, with a focus on fast execution times, are the subjects of this work; in particular, control design and implementation in field programmable gate array (FPGA) circuits are investigated. The optimal design is guided by the physical constraints of the system when selecting the weighting matrices, and the plant model is complemented by augmenting appropriate disturbance models. Compensation of the force-field nonlinearities is proposed to decrease the uncertainty of the actuator, and a disturbance-observer-based unbalance compensation is presented for cancelling the magnetic force vibrations or the vibrations in the measured positions. The theoretical studies are verified by practical experiments on a custom-built laboratory test rig, which uses a prototyping control platform developed in the scope of this work.
In summary, the work takes a step towards an embedded single-chip FPGA-based controller for AMBs.
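
LQ controllers like those above are obtained from a discrete-time Riccati equation. A minimal sketch for a scalar toy plant (not the thesis's flexible-rotor model), iterating the Riccati recursion to its fixed point:

```python
def dlqr_scalar(a, b, q, r, iters=1000):
    """Infinite-horizon discrete-time LQ regulator for a scalar plant
    x[k+1] = a*x[k] + b*u[k] with cost sum(q*x^2 + r*u^2), obtained by
    iterating the Riccati recursion to its fixed point."""
    p = q
    for _ in range(iters):
        p = q + a * p * a - (a * p * b) ** 2 / (r + b * p * b)
    k = a * p * b / (r + b * p * b)   # state-feedback law u = -k * x
    return k, p

# Unstable toy plant (a = 1.2): the closed-loop pole a - b*k must end
# up inside the unit circle.
k, p = dlqr_scalar(a=1.2, b=1.0, q=1.0, r=1.0)
print(abs(1.2 - k) < 1.0)  # True -> closed loop is stable
```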

Relevance:

100.00%

Publisher:

Abstract:

Healthcare today exploits the possibilities of information technology (IT) to improve the quality of care, reduce care-related costs, and simplify and streamline physicians' workflows. The information systems that form the core of every IT solution must be developed to meet numerous requirements, one of which is the ability to integrate seamlessly with other information systems. System integration nevertheless remains a challenging task, even though several standards have been developed for it. This work describes the interfacing solution of a newly developed medical information system. The requirements placed on such an application are discussed, and the way in which these requirements are fulfilled is presented. The interfacing solution is divided into two parts: the information system interface and the interfacing engine. The former comprises the basic functionality needed to receive data from and send data to other systems, while the latter provides support for the standards used in the production environment. The design of both parts is presented thoroughly in this work. The problem was solved by means of a modular and generic design, which is shown to be a robust and flexible approach capable of addressing a wide range of requirements placed on the interfacing solution. In addition, it is shown how, thanks to its flexibility, the solution can easily be adapted to requirements that have not been identified in advance, thereby also providing a foundation for future needs.
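
The modular split into a generic interface and a per-standard interfacing engine can be sketched as a handler registry. The class and "standard" names below are illustrative, not from the thesis:

```python
class InterfacingEngine:
    """Minimal sketch of a modular interfacing engine: a registry of
    per-standard message translators behind one generic interface, so
    that supporting a new messaging standard only requires registering
    a new handler rather than changing the core system."""

    def __init__(self):
        self._handlers = {}

    def register(self, standard, handler):
        self._handlers[standard] = handler

    def translate(self, standard, message):
        if standard not in self._handlers:
            raise ValueError(f"no handler registered for {standard!r}")
        return self._handlers[standard](message)

engine = InterfacingEngine()
# A toy 'standard' that upper-cases field names; a real deployment would
# register translators for the standards used in production.
engine.register("demo", lambda msg: {k.upper(): v for k, v in msg.items()})
print(engine.translate("demo", {"patient": "12345"}))  # {'PATIENT': '12345'}
```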