965 results for Speaker verification


Relevance: 10.00%

Abstract:

This thesis presents an approach for formulating and validating a space-averaged drag model for coarse-mesh simulations of gas-solid flows in fluidized beds using the two-fluid model. Proper modelling of the fluid dynamics is central to understanding any industrial multiphase flow. Gas-solid flows in fluidized beds are heterogeneous and are usually simulated with an Eulerian description of the phases. Such a description requires fine meshes and small time steps to predict the hydrodynamics properly; these constraints lead to a large number of control volumes and computational times that are unaffordable for simulations of large-scale fluidized beds. Without proper closure models, coarse-mesh simulations of fluidized beds do not give reasonable results: the coarse mesh fails to resolve the mesoscale structures and yields uniform solids-concentration profiles. For a circulating fluidized bed riser, such profiles produce an overestimated drag force between the gas and solid phases and, consequently, an overestimated solids mass flux at the outlet. There is thus a need for closure correlations that can accurately predict the hydrodynamics on coarse meshes. This thesis uses a space-averaging approach to formulate closure models for coarse-mesh simulations of gas-solid flow in fluidized beds with Geldart group B particles. The main parameters of the space-averaged drag closure were found to be the averaging size, the solid volume fraction, and the distance from the wall. The closure model for the gas-solid drag force was formulated and validated for coarse-mesh simulations of the riser, which verified the modelling approach; coarse-mesh simulations using the corrected drag model yielded lower values of solids mass flux. Such an approach is a promising tool for formulating closure models for coarse-mesh simulations of large-scale fluidized beds.
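The filtered-drag idea described above can be illustrated with a small sketch: a microscopic drag law (here the standard Wen & Yu correlation) is scaled by a correction factor that depends on the averaging size and the solid volume fraction. The functional form and coefficients of the correction below are hypothetical placeholders for illustration, not the correlation developed in the thesis.

```python
def beta_wen_yu(alpha_s, rho_g=1.2, d_p=150e-6, mu_g=1.8e-5, u_slip=0.5):
    """Microscopic Wen & Yu drag coefficient (per unit volume) for a
    gas-solid suspension; parameter values are typical Geldart B numbers."""
    alpha_g = 1.0 - alpha_s
    re = alpha_g * rho_g * d_p * u_slip / mu_g
    cd = 24.0 / re * (1.0 + 0.15 * re ** 0.687)   # Re < 1000 branch
    return 0.75 * cd * alpha_s * alpha_g * rho_g * u_slip / d_p * alpha_g ** -2.65

def filtered_drag(alpha_s, delta_filter, d_p=150e-6):
    """Space-averaged drag: microscopic drag times a correction H in (0, 1].
    H is a made-up illustrative form that decreases with the dimensionless
    averaging size and is strongest at intermediate solid fractions, where
    unresolved clustering reduces the effective drag the most."""
    delta_star = delta_filter / d_p               # dimensionless filter size
    shape = 4.0 * alpha_s * (1.0 - alpha_s)       # peaks at alpha_s = 0.5
    h = 1.0 / (1.0 + 0.05 * delta_star * shape)
    return h * beta_wen_yu(alpha_s)
```

With a correction of this kind, a coarse-mesh cell sees less drag than the microscopic law predicts, which is what lowers the overpredicted solids mass flux at the riser outlet.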

Relevance: 10.00%

Abstract:

The growing number of elderly people in the coming decades will place an increasing burden on municipal home care. Cities' limited resources are already strained, and no substantial improvement to the current situation is in sight in the coming years. The home care workforce cannot be increased sufficiently to keep pace with growing customer volumes, so a high quality of service cannot be guaranteed in the future. The home care services of the city of Lahti are seeking technical solutions to these challenges, including an optimization algorithm developed for allocating home care tasks, and simulation. Using action research, this master's thesis examines the benefits and limitations of simulation from the perspective of home care in Lahti. The challenges of home care in the Laune district are worked through in four workshops. The thesis presents the features of the Quest simulation software and the construction of a simulation model of the Laune district, from design to verification. The added value of the work for developing home care is demonstrated in four alternative simulation runs measuring the distances care workers travel between clients and the time spent on those journeys.

Relevance: 10.00%

Abstract:

Commercially available haptic interfaces are usable for many purposes. As generic devices, however, they are not well suited to controlling heavy-duty mobile working machines such as mining machines, container handling equipment and excavators. Alternative mechanical constructions for a haptic controller are presented and analysed. A virtual reality environment (VRE) was built to test the proposed haptic controller mechanisms. Verification of an electric motor emulating a hydraulic pump in the electro-hydraulic system of a mobile working machine is also carried out. A real-time simulator based on multibody dynamics software in a hardware-in-the-loop (HIL) setup was used for the tests. Recommendations for further development of the haptic controller and the emulator electric motor are given.

Relevance: 10.00%

Abstract:

Today's networked systems are becoming increasingly complex and diverse. Current simulation and runtime verification techniques do not support the efficient development of such systems, and the reliability of the simulated or verified systems is not thoroughly ensured. To address these challenges, the use of formal techniques to reason about networked system development is growing; at the same time, the mathematical background required by formal techniques is a barrier that keeps network designers from employing them efficiently, so these techniques are not widely used for developing networked systems. The objective of this thesis is to propose formal approaches for the development of reliable networked systems while taking efficiency into account. With respect to reliability, we propose the architectural development of correct-by-construction networked system models. With respect to efficiency, we propose reusable network architectures as well as reusable network development. At the core of our development methodology, we employ abstraction and refinement techniques for the development and analysis of networked systems. We evaluate our proposal by applying the proposed architectures to a pervasive class of dynamic networks, wireless sensor network architectures, as well as to a pervasive class of static networks, network-on-chip architectures. The ultimate goal of our research is to put forward the idea of building libraries of pre-proved rules for the efficient modelling, development, and analysis of networked systems. We address both qualitative and quantitative analysis of networks with varied formal tool support, using a theorem prover (the Rodin platform) and a statistical model checker (UPPAAL SMC).

Relevance: 10.00%

Abstract:

Abstract: The VHS and CTR were assessed using computerized thoracic radiographs of ten clinically healthy tufted capuchin monkeys (five males and five females) from the Wild Animal Screening Center in São Luís (Centro de Triagem de Animais Silvestres de São Luís-MA-CETAS). Radiographs were taken in laterolateral and dorsoventral projections to calculate the vertebral heart size (VHS) and the cardiothoracic ratio (CTR). The VHS showed mean values of 9.34±0.32 vertebrae in males and 9.16±0.34 vertebrae in females, with no statistical difference between the sexes (p>0.05). The CTR showed mean values of 0.55±0.04 (males) and 0.52±0.03 (females), also with no statistical difference between the sexes (p>0.05). There was a positive correlation between VHS and CTR (r=0.78). The thoracic and heart diameters showed mean values of 5.70±0.48 cm and 2.16±0.40 cm in the males, respectively; in the females they measured 5.32±0.39 cm and 2.94±0.32 cm, with no statistical difference between the sexes. Our results show that the high correlation between VHS and CTR indicates the two methods estimate alterations in the heart silhouette on radiographic examination of tufted capuchins with similar clinical precision, making this an easy technique to apply that can be considered in the investigation of heart problems in this wild species.
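The reported r = 0.78 between the two indices is a plain Pearson correlation coefficient. As a sketch, it can be computed as below; the paired values are hypothetical, not the study's measurements.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical paired VHS/CTR measurements from ten animals
vhs = [9.1, 9.3, 9.5, 9.0, 9.4, 9.2, 9.6, 9.1, 9.3, 9.5]
ctr = [0.52, 0.54, 0.57, 0.51, 0.55, 0.53, 0.58, 0.52, 0.54, 0.56]
```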

Relevance: 10.00%

Abstract:

Design of flight control laws, verification of performance predictions, and the implementation of flight simulations are tasks that require a mathematical model of the aircraft dynamics. These dynamical models are characterized by coefficients (aerodynamic derivatives) whose values must be determined from flight tests. This work outlines the use of the Extended Kalman Filter (EKF) for obtaining the aerodynamic derivatives of an aircraft. The EKF has several advantages over the more traditional least-squares (LS) method. The most important are that there are no restrictions on linearity or on the form in which the parameters appear in the mathematical model describing the system, and the parameters are not required to be time invariant. The EKF uses the statistical properties of the process and observation noise to produce estimates based on the mean square error of the estimates themselves; the LS method, by contrast, minimizes a cost function based on the plant output behaviour. Results for the estimation of some longitudinal aerodynamic derivatives from simulated data are presented.
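The parameter-as-state trick the abstract relies on can be sketched for a toy scalar model: the unknown coefficient is appended to the state vector and the EKF estimates both jointly. This is an illustrative example only, not the aircraft model used in the work; the dynamics, noise levels and initial guesses below are arbitrary.

```python
def ekf_estimate_a(zs, a0=0.5, q=1e-4, r=0.01):
    """Joint EKF estimation of state x and unknown parameter a for
    x[k+1] = a*x[k] (+ noise), measured as z[k] = x[k] (+ noise).
    Augmented state s = [x, a]; the filter never needs the model to be
    linear in a -- only the Jacobian F of the transition is required."""
    x, a = zs[0], a0
    P = [[1.0, 0.0], [0.0, 1.0]]                  # covariance of [x, a]
    for z in zs[1:]:
        # --- predict ---
        x_pred = a * x
        F = [[a, x], [0.0, 1.0]]                  # Jacobian of s -> [a*x, a]
        FP = [[F[0][0]*P[0][0] + F[0][1]*P[1][0],
               F[0][0]*P[0][1] + F[0][1]*P[1][1]],
              [P[1][0], P[1][1]]]                 # second row of F is [0, 1]
        P = [[FP[0][0]*F[0][0] + FP[0][1]*F[0][1] + q, FP[0][1]],
             [FP[1][0]*F[0][0] + FP[1][1]*F[0][1], FP[1][1] + q]]
        # --- update with scalar measurement z = x + v  (H = [1, 0]) ---
        S = P[0][0] + r
        K = [P[0][0] / S, P[1][0] / S]
        innov = z - x_pred
        x, a = x_pred + K[0] * innov, a + K[1] * innov
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
    return a
```

Feeding the filter measurements of a decaying sequence x[k] = 10·0.95^k recovers a ≈ 0.95 within a few dozen steps.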

Relevance: 10.00%

Abstract:

So far, scholars have discussed how the characteristics of consumer co-operatives (co-operative principles, values, and the dual role of members as users and owners) can potentially give them a competitive advantage over investor-owned firms (IOFs). In addition, concern for the community (partly derived from locality and regionality) has been seen as a potential source of success for consumer co-operatives. On the other hand, the geographic-bound purpose of consumer co-operation makes the consumer co-operative a challenging company form to manage. According to this purpose, co-operatives are obligated 1) to provide the owners with the services and goods they need, at more affordable prices than their competitors offer, and/or 2) to operate in areas in which competitors do not want to operate (for example, because of low profitability in a certain line of business or region). Consumer co-operatives therefore have to operate very efficiently to execute this geographic-bound corporate purpose (for example, they cannot withdraw from competition during the declining stages of a business). This efficiency cannot, however, be achieved by any means whatsoever, as acceptance from important regional stakeholders is the basic operational precondition and lifeline in the long run. The central question for the survival and success of consumer co-operatives is thus: how should a consumer co-operative execute its corporate purpose so that it remains the best alternative for its members in the long run? This question has remained unanswered and lacks empirical evidence in previous studies on the strategic management of consumer co-operation. In more detail, scholars have not yet empirically investigated the question: how can consumer co-operatives use financial and social capital to achieve a sustained competitive advantage? It is this research gap that this doctoral dissertation aims to fill.
The dissertation answers these questions by combining interview data from S Group co-operatives and the central organizations in the S Group network (33 interviews in total), archival material, and 56 published media articles and reports. The study is based on a qualitative case study approach aimed at theory development rather than theory verification, as theory in this field of study is considered nascent. The findings indicate, first, that consumer co-operatives accumulate financial capital 1) by making a profit (in order to invest and grow) and 2) by utilizing a network-based organizational structure (local supply chain economies). As a result of financial capital accumulation, consumer co-operatives achieve efficiency gains while remaining local. In addition, a strong financial capital base increases consumer co-operatives' independence, competitiveness, and ability to participate in regional development (in accordance with their geographically bound corporate purpose). Second, consumer co-operatives accumulate social capital through informal networking with important regional stakeholders, corporate social responsibility (CSR) behaviour and CSR reporting, pursuit of the common good, and interaction and identity sharing. As a result of social capital accumulation, consumer co-operatives obtain the resources for managing 1) institutional dependencies and 2) customer relations. By accumulating both social and financial capital through the actions presented above, consumer co-operatives can achieve a sustained competitive advantage. Finally, the thesis provides useful ideas and new knowledge for co-operative managers concerning why and how consumer co-operatives should accumulate financial and social capital to achieve sustained competitive advantage while staying aligned with their corporate purpose.

Relevance: 10.00%

Abstract:

Over the past two decades, considerably stronger steel grades have been developed, but their use has not spread at anywhere near the same pace. Besides the higher price, one significant reason is that designers often lack sufficient knowledge of the situations in which a higher-strength steel grade brings substantial benefit. The situation is not helped by the fact that the current standards offer no guidance for the use and dimensioning of the strongest steels, those with yield strengths above 700 MPa. This thesis aims to provide the designer with guidelines and rules of thumb for choosing a suitable strength class and profile, and for the use of higher-strength steel grades in general. Using a stronger steel grade allows a lighter structure and considerable weight savings. Stability criteria, however, often become the problem, since the buckling resistance of steel depends strongly on its strength class: the stronger the steel, the more easily it buckles. Combined with the fact that an optimized structure made of stronger steel is in any case smaller and lighter, the deflection of a structure dimensioned for load-bearing capacity quickly exceeds the allowed limits as one moves to higher strength classes. The thesis therefore looks for ways to find a suitable compromise between strength and stiffness. Because shaping and cross-section have a large influence on both deflection and stability, different cross-section alternatives are studied and an optimal cross-section for a beam in bending is sought with a mathematical optimization model. Once the cross-section alternatives have been analysed and optimized for bending, the cross-sections are also studied under other load cases. Because of the considerable computational effort involved, Matlab software is used for the optimization itself and the Femap program for studying the other load cases and verifying the results.
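The strength-stiffness tension described above can be made concrete with a simply supported beam under a uniform load (a hypothetical load case with illustrative numbers): the bending-strength requirement scales with 1/f_y, but the deflection requirement involves only E, which is the same for every steel grade.

```python
def beam_requirements(f_y, L=6.0, q=10e3, E=210e9, defl_ratio=250):
    """Required section modulus W (strength criterion) and second moment
    of area I (deflection criterion, delta_max = L/defl_ratio) for a
    simply supported beam with uniform load q [N/m] and span L [m]."""
    M = q * L ** 2 / 8.0                        # midspan bending moment
    W_req = M / f_y                             # strength: sigma = M/W <= f_y
    delta_allowed = L / defl_ratio
    I_req = 5.0 * q * L ** 4 / (384.0 * E * delta_allowed)
    return W_req, I_req

w355, i355 = beam_requirements(355e6)   # S355
w960, i960 = beam_requirements(960e6)   # S960
# W_req shrinks by the strength ratio, I_req does not move at all: this is
# why deflection (and buckling of the resulting thinner sections) starts
# to govern the design in the highest strength classes.
```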

Relevance: 10.00%

Abstract:

Negative refractive index materials and the propagation of electromagnetic waves in them began to attract scientists' attention relatively recently. This review highlights historically important and recent papers on practical and theoretical aspects of the subject: the basic properties and peculiarities of such materials with regard to both their design and wave propagation in them, experimental verification of the predictions made for them theoretically, possible practical applications, and prospects in this area.
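The hallmark property of such materials, negative refraction, follows directly from Snell's law when n < 0: the refracted ray emerges on the same side of the normal as the incident ray. A minimal numerical check, assuming ideal lossless media:

```python
import math

def refraction_angle(theta_i_deg, n1=1.0, n2=1.5):
    """Transmitted angle from Snell's law n1*sin(t_i) = n2*sin(t_t).
    A negative n2 yields a negative transmitted angle, i.e. the ray is
    bent to the 'wrong' side of the normal."""
    s = n1 * math.sin(math.radians(theta_i_deg)) / n2
    return math.degrees(math.asin(s))
```

For example, a ray incident at 30° from vacuum onto an n = -1 medium refracts at -30°, the mirror image of ordinary refraction.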

Relevance: 10.00%

Abstract:

Conventional diagnostic tests and technologies typically allow only a single analysis and result per test. The aim of this study was to propose robust and multiplex array-in-well test platforms based on oligonucleotide and protein arrays, combining the advantages of simple instrumentation and upconverting phosphor (UCP) reporter technology. UCPs are luminescent lanthanide-doped crystals with a unique capability to convert infrared radiation into visible light. No autofluorescence is produced from the sample under infrared excitation, enabling the development of highly sensitive assays. In this study, an oligonucleotide array-in-well hybridization assay was developed for the detection and genotyping of human adenoviruses. The study provided a verification of the advantages and potential of the UCP-based reporter technology in multiplex assays, as well as of anti-Stokes photoluminescence detection with a new anti-Stokes photoluminescence imager. The developed assay was technically improved and used to detect and genotype adenovirus types from clinical specimens. Based on the results of the epidemiological study, an outbreak of adenovirus type B03 was observed in the autumn of 2010. A quantitative array-in-well immunoassay was developed for three target analytes (prostate specific antigen, thyroid stimulating hormone, and luteinizing hormone); quantitative results were obtained for each analyte, and the analytical sensitivities in buffer were in the clinically relevant range. Another protein-based array-in-well assay was developed for multiplex serodiagnostics and was able to detect parvovirus B19 IgG and adenovirus IgG antibodies simultaneously from serum samples, in agreement with reference assays. The study demonstrated that UCP technology is a robust detection method for diverse multiplex imaging-based array-in-well assays.

Relevance: 10.00%

Abstract:

A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behaviour of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g. hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfil the service goals. Designing and developing such services for advanced scenarios within REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behaviour; systems that can be trusted for their behaviour can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology for designing behavioural REST web service interfaces and their compositions. The behavioural interfaces provide information on what methods can be invoked on a service and on the pre- and postconditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modelling language, which has a wide user base and mature, continuously evolving tools.
We use a UML class diagram and a UML state machine diagram with additional design constraints to provide the resource and behavioural models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach: unfulfilled requirements of the service can be traced back and forth to capture faults in the design models and in other elements of the software development environment. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioural REST interfaces. To overcome inconsistencies and design errors in our service models, we use semantic technologies: the REST interfaces are represented in the web ontology language OWL 2, so they can be part of the semantic web, and OWL 2 reasoners are used to check for unsatisfiable concepts, which would result in implementations that fail. This step is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners. The third contribution of this thesis is the verification and validation of REST web services, for which we use model checking techniques with the UPPAAL model checker.
Timed automata are generated from the UML-based service design models with our transformation tool and verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach: test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioural REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and postconditions; the preconditions constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can then be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
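The design-by-contract flavour of the behavioural interfaces can be sketched as a stateful booking resource whose methods check their pre- and postconditions at runtime. This is a hypothetical illustration in the spirit of the hotel booking example, not the thesis's generated code.

```python
class HotelBookingService:
    """Sketch of a stateful REST-style resource with runtime pre- and
    postcondition checks (hypothetical API for illustration)."""

    def __init__(self, rooms):
        self.free = set(rooms)
        self.booked = set()

    def post_booking(self, room):        # POST /bookings
        assert room in self.free, "precondition: room must be available"
        self.free.remove(room)
        self.booked.add(room)
        assert room in self.booked       # postcondition: booking recorded
        return {"room": room, "status": "booked"}

    def delete_booking(self, room):      # DELETE /bookings/{room}
        assert room in self.booked, "precondition: booking must exist"
        self.booked.remove(room)
        self.free.add(room)
        assert room in self.free         # postcondition: room released
        return {"room": room, "status": "cancelled"}
```

A second POST for the same room violates the precondition, which is exactly the kind of illegal request sequence the behavioural interface rules out.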

Relevance: 10.00%

Abstract:

In this master's thesis, wind speeds and directions were modelled with the aim of developing suitable models for hourly, daily, weekly and monthly forecasting. Artificial neural networks implemented in MATLAB were used to perform the forecasts. Three main types of artificial neural network were built: feed-forward neural networks, Jordan-Elman neural networks and cascade-forward neural networks. Four sub-models of each of these networks were also built, corresponding to the four forecast horizons, for both wind speeds and directions. A single neural network topology was used for each forecast horizon, regardless of the model type. All the models were trained with real data on wind speeds and directions collected over a period of two years in the municipality of Puumala in Finland. Of the data, 70% was used for training, validation and testing of the models; the second-to-last 15% was presented to the trained models for verification, and the model outputs were compared against the last 15% of the original data by measuring the mean square errors and sum square errors between them. Based on the results, the feed-forward networks returned the lowest generalization errors for hourly, weekly and monthly forecasts of wind speeds, while Jordan-Elman networks returned the lowest errors for daily wind speeds. Cascade-forward networks gave the lowest errors for daily, weekly and monthly wind directions, and Jordan-Elman networks the lowest errors for hourly forecasting. The errors were relatively low during training of the models but shot up upon simulation with new inputs. In addition, a combination of hyperbolic tangent transfer functions for both the hidden and output layers returned better results than other combinations of transfer functions. In general, wind speeds were more predictable than wind directions, opening up opportunities for further research into better models for wind direction forecasting.
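The 70/15/15 evaluation protocol described above can be sketched as follows. The `model` callable stands in for a trained network, and the error measures are the MSE/SSE used in the thesis; the pairing of verification inputs with final-segment targets is a simplifying assumption for illustration.

```python
def split_and_score(series, model):
    """First 70% of the series reserved for training/validation/testing,
    next 15% fed to the trained model, and the model outputs scored
    against the final 15% with sum-square and mean-square error."""
    n = len(series)
    i70, i85 = int(0.70 * n), int(0.85 * n)
    train = series[:i70]                  # would be used to fit the model
    verify_in, target = series[i70:i85], series[i85:]
    preds = [model(x) for x in verify_in]
    sse = sum((p - t) ** 2 for p, t in zip(preds, target))
    return sse / len(target), sse         # (MSE, SSE)
```

For a toy linear series and an exact 15-step-ahead model, both errors vanish; a trained network would be judged by how close to zero it gets on this held-out segment.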

Relevance: 10.00%

Abstract:

This thesis aims to find an effective way of conducting a target audience analysis (TAA) in the cyber domain. Two main focal points are addressed: the nature of the cyber domain and the method of the TAA. For the cyber domain, the objective is to identify the opportunities, restrictions and caveats that result from its digital and temporal nature; this is the environment in which the TAA method is examined in this study. As the TAA is an important step of any psychological operation and critical to its success, the method used must cover all the main aspects affecting the choice of a proper target audience. The first part of the research was done by sending an open-ended questionnaire to operators in the field of information warfare both in Finland and abroad. As the results were inconclusive, the research was completed by assessing the applicability of the United States Army field manual FM 3-05.301 in the cyber domain via a theory-based content analysis. FM 3-05.301 was chosen because it presents a complete method for the TAA process. The findings were tested against the results of the questionnaire and recent scientific research in the field of psychology. The cyber domain was found to be "fast and vast", volatile and uncontrollable. Although governed by laws to some extent, the cyber domain is unpredictable by nature and cannot be controlled to any reasonable degree. The anonymity and lack of verification often present in digital channels mean that anyone can have an opinion, and any message sent may change or even become counterproductive to its original purpose. The TAA method of FM 3-05.301 is applicable in the cyber domain, although some parts of the method are outdated and are therefore suggested to be updated if the method is used in that environment. The target audience categories of step two of the process were replaced by new groups that exist in the digital environment.
The accessibility assessment (step eight) was also redefined, since in digital media the mere existence of a written text is typically not enough to convey the intended message to the target audience. The scientific work done in computer science, psychology and sociology on the behaviour of people in social media (and in the cyber domain overall) calls for a more extensive remake of the TAA process; that, however, falls outside the scope of this work. It is therefore suggested that further research be carried out on computer-assisted methods and a more thorough TAA process, utilizing the latest findings on human behaviour.

Relevance: 10.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 10.00%

Abstract:

In the doctoral dissertation, low-voltage direct current (LVDC) distribution system stability, supply security and power quality are evaluated by computational modelling and measurements on an LVDC research platform. Computational models for the LVDC network analysis are developed. Time-domain simulation models are implemented in the time-domain simulation environment PSCAD/EMTDC. The PSCAD/EMTDC models of the LVDC network are applied to the transient behaviour and power quality studies. The LVDC network power loss model is developed in a MATLAB environment and is capable of fast estimation of the network and component power losses. The model integrates analytical equations that describe the power loss mechanism of the network components with power flow calculations. For an LVDC network research platform, a monitoring and control software solution is developed. The solution is used to deliver measurement data for verification of the developed models and analysis of the modelling results. In the work, the power loss mechanism of the LVDC network components and its main dependencies are described. Energy loss distribution of the LVDC network components is presented. Power quality measurements and current spectra are provided and harmonic pollution on the DC network is analysed. The transient behaviour of the network is verified through time-domain simulations. DC capacitor guidelines for an LVDC power distribution network are introduced. The power loss analysis results show that one of the main optimisation targets for an LVDC power distribution network should be reduction of the no-load losses and efficiency improvement of converters at partial loads. Low-frequency spectra of the network voltages and currents are shown, and harmonic propagation is analysed. Power quality in the LVDC network point of common coupling (PCC) is discussed. Power quality standard requirements are shown to be met by the LVDC network. 
The network behaviour during transients is analysed by time-domain simulations, and the network is shown to be transient stable during large-scale disturbances. Measurement results from the LVDC research platform confirming this are presented in the work.
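The conclusion that no-load losses dominate converter efficiency at partial load can be reproduced with the standard analytical loss form such models integrate: a constant term plus terms linear and quadratic in the transferred power. The coefficients below are illustrative, not values from the measured converters.

```python
def converter_efficiency(p_out, p0=50.0, a=0.01, b=1e-6):
    """Efficiency of a converter whose losses follow
    P_loss = p0 + a*P_out + b*P_out**2
    (no-load, conduction-like and ohmic-like terms, in watts).
    At low load the constant p0 dominates and efficiency collapses,
    which is why no-load losses are a prime optimisation target."""
    p_loss = p0 + a * p_out + b * p_out ** 2
    return p_out / (p_out + p_loss)
```

With these placeholder coefficients a converter running at 5% of a 10 kW rating is noticeably less efficient than at full load, despite the quadratic term being largest there.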