13 results for Link quality estimation
at Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland
Abstract:
This thesis studies programmatic, application-layer means of improving energy efficiency in the VoIP application domain. The work concentrates on optimizations suitable for VoIP implementations using SIP and IEEE 802.11 technologies. Because energy-saving optimizations can affect perceived call quality, energy-saving means are studied together with the factors that affect perceived call quality. The thesis gives a general view of the topic. Based on theory, adaptive optimization schemes for dynamically controlling the application's operation are proposed. A runtime quality model, capable of being integrated into the optimization schemes, is developed for VoIP call quality estimation. Power consumption measurements based on the proposed optimization schemes are carried out to determine the achievable gains. The measurement results show that a reduction in power consumption can be achieved with the help of adaptive optimization schemes.
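Runtime VoIP quality estimators of this kind are commonly built on the ITU-T E-model, which maps measured delay and packet loss to an R factor and then to a MOS score. The sketch below is an illustrative simplification along those lines, not the thesis's actual model; the default values (R0 = 93.2, Bpl = 25.1) are the G.107 defaults for random loss:

```python
def r_to_mos(r):
    """Map an E-model R factor to a MOS score (ITU-T G.107 mapping)."""
    if r <= 0:
        return 1.0
    if r >= 100:
        return 4.5
    return 1 + 0.035 * r + r * (r - 60) * (100 - r) * 7e-6

def estimate_r(delay_ms, loss_pct, ie=0.0, bpl=25.1):
    """Simplified E-model: start from the default R0 = 93.2 and subtract
    the delay impairment Id and the effective equipment impairment Ie_eff."""
    id_ = 0.024 * delay_ms
    if delay_ms > 177.3:                     # extra penalty above 177.3 ms
        id_ += 0.11 * (delay_ms - 177.3)
    ie_eff = ie + (95 - ie) * loss_pct / (loss_pct + bpl)
    return 93.2 - id_ - ie_eff
```

An optimization scheme could poll such an estimator at runtime and back off its power-saving mode whenever the predicted MOS drops below a target.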
Abstract:
Technical analysis of Low Voltage Direct Current (LVDC) distribution systems shows that LVDC transmission provides higher customer voltage quality. One of the problems in LVDC distribution networks is that converters are required at both ends of the DC line. Because the converters produce not pure DC voltage but some fluctuations as well, large electrolytic capacitors are required to reduce voltage distortions on the DC side. This master's thesis focuses on calculating the required DC-link capacitance for LVDC transmission and on estimating the influence of different parameters on the voltage quality. The goal is to investigate methods for estimating the DC-link capacitance and its location in the transmission line.
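A first-order feel for the capacitance involved comes from the textbook sizing rule C ≈ I_load / (f_ripple · ΔV): the capacitor must carry the load current between ripple peaks. This is a generic hand calculation, not the estimation method investigated in the thesis, and the example figures are illustrative:

```python
def dc_link_capacitance(load_power_w, v_dc, ripple_frac, ripple_freq_hz):
    """Textbook first-order sizing of a DC-link capacitor:
    C ~ I_load / (f_ripple * delta_V)."""
    i_load = load_power_w / v_dc          # average load current [A]
    delta_v = ripple_frac * v_dc          # allowed peak ripple [V]
    return i_load / (ripple_freq_hz * delta_v)

# e.g. 10 kW at 750 V DC, 2 % ripple, 300 Hz ripple (six-pulse rectifier)
c = dc_link_capacitance(10_000, 750, 0.02, 300)   # about 3 mF
```

The rule makes the parameter dependencies visible: required capacitance grows with load power and shrinks as the permitted ripple or the ripple frequency increases.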
Abstract:
The wide adoption of the Internet Protocol (IP) as the de facto protocol for most communication networks has established a need to develop IP-capable data link layer protocol solutions for Machine-to-Machine (M2M) and Internet of Things (IoT) networks. However, the wireless networks used for M2M and IoT applications usually lack the resources commonly associated with modern wireless communication networks. The existing IP-capable data link layer solutions for wireless IoT networks provide the necessary overhead-minimising and frame-optimising features, but are often built to be compatible only with IPv6 and specific radio platforms. The objective of this thesis is to design an IPv4-compatible data link layer for Netcontrol Oy's narrowband half-duplex packet data radio system. Based on extensive literature research, system modelling and solution concept testing, this thesis proposes the use of the tunslip protocol as the basis for the system's data link layer protocol development. In addition to the functionality of tunslip, this thesis discusses the network, routing, compression, security and collision avoidance changes that must be made to the radio platform for it to be IP-compatible while still maintaining its point-to-multipoint and multi-hop network characteristics. The data link layer design consists of the radio application, a dynamic Maximum Transmission Unit (MTU) optimisation daemon and the tunslip interface. The proposed design uses tunslip to create an IP-capable data link protocol interface. The radio application receives data from tunslip, compresses the packets and uses the IP addressing information for radio network addressing and routing before forwarding the message to the radio network. The dynamic MTU size optimisation daemon controls the maximum MTU size of the tunslip interface according to a link quality assessment calculated from the radio network diagnostic data received from the radio application.
To determine the usability of tunslip as the basis for the data link layer protocol, the tunslip interface is tested with both IEEE 802.15.4 radios and packet data radios. The test cases measure the radio network's usability for User Datagram Protocol (UDP) based applications without applying any header or content compression. The test results for the packet data radios show that the typical success rate for packet reception over a single-hop link is above 99 %, with a round-trip delay of 0.315 s for 63-byte packets.
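A daemon that steers the interface MTU from a link-quality estimate can follow an AIMD-style policy: grow the MTU additively while the link is healthy, and cut it sharply when reception degrades. The thresholds and step sizes below are illustrative assumptions, not Netcontrol's actual daemon logic:

```python
def next_mtu(current_mtu, success_rate, mtu_min=63, mtu_max=512):
    """AIMD-style MTU adjustment from a packet-reception success rate.

    Healthy link (>= 99 % success): additive increase of 16 bytes.
    Degraded link (< 90 % success): multiplicative decrease (halve).
    Otherwise: hold the current MTU.
    """
    if success_rate >= 0.99:
        return min(current_mtu + 16, mtu_max)
    if success_rate < 0.90:
        return max(current_mtu // 2, mtu_min)
    return current_mtu
```

Halving on degradation limits the time spent fragmenting into a lossy link, while the small additive step probes cautiously back toward the maximum frame size.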
Abstract:
This master's thesis was carried out at the Department of Production Engineering of Lappeenranta University of Technology. The work is part of the department's "LELA" quality control project for sheet-metal products. The project partners were Abloy Oy (Joensuu), Flextronics Enclosures (Oulu), Hihra Oy (Turku), Lillbacka Oy (Alahärmä), Nokia Networks (Oulu), Segerström & Svensson (Uusikaupunki) and Scanfil Oy (Sievi). The main funding party of the project was the National Technology Agency of Finland, Tekes. The work was based on a quality assessment study carried out in the participating companies manufacturing sheet-metal components, which consisted of a defect survey of production and a quality assessment questionnaire. According to the defect survey, the probability of defects occurring in a typical company manufacturing sheet-metal components is roughly 5–12 % per product. Most defects arose in eccentric-press work. The study shows that, with the help of a defect survey, a company can map the defects arising in its production and then target quality improvement measures at the right areas of production.
Abstract:
Image quality is among the most studied and applied topics. This thesis considers colour quality and spectral images. An overview is given of existing quality assessment methods for compressed and individual images, with emphasis on applying these methods to spectral images. A spectral colour appearance model for colour image quality assessment is presented. The model is applied to colour images reproduced from spectral images. It is based both on a statistical spectral image model, which links the parameters of spectral images and photographs, and on the general appearance of the image. The relationship between the statistical spectral parameters and the physical parameters of colour images has been verified with computer-based image modelling. Based on the properties of the model, an experimental method for colour image quality assessment has been developed. An expert-based questionnaire method and a fuzzy inference system for colour image quality assessment have also been developed. The study shows that the spectral-colour relationship and the fuzzy inference system are effectively applicable to colour image quality assessment.
Abstract:
Nowadays software testing and quality assurance are highly valued in the software development process. Software testing is not a single concrete discipline; it is the process of validation and verification that starts from the idea of a future product and ends with the end of the product's maintenance. Industry strongly stresses the importance of software testing methods and tools that can be applied at the different testing phases. The initial objectives of this thesis were to provide a sufficient literature review of the different testing phases and, for each phase, to define a method that can be used effectively to improve software quality. The software testing phases chosen for study are: unit testing, integration testing, functional testing, system testing, acceptance testing and usability testing. The research showed that there are many software testing methods that can be applied at the different phases, and in most cases the choice of method should depend on the software type and its specification. For each phase, the thesis identifies a characteristic problem, suggests a method that can help eliminate it, and describes that method in detail.
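At the unit-testing phase, the smallest testable pieces of the software are verified in isolation. A minimal sketch with Python's standard `unittest` framework (the `normalize` function is a hypothetical unit under test, invented for illustration):

```python
import unittest

def normalize(scores):
    """Scale a list of numeric scores into the range [0, 1]."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

class TestNormalize(unittest.TestCase):
    def test_endpoints(self):
        # smallest value maps to 0.0, largest to 1.0
        result = normalize([2, 4, 6])
        self.assertEqual(result[0], 0.0)
        self.assertEqual(result[-1], 1.0)

    def test_constant_input(self):
        # degenerate input must not divide by zero
        self.assertEqual(normalize([5, 5]), [0.0, 0.0])
```

Run with `python -m unittest`. Each later phase (integration, system, acceptance) widens the scope of what is exercised, but the pattern of explicit, repeatable checks stays the same.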
Abstract:
Line converters have become an attractive AC/DC power conversion solution in industrial applications. Line converters are based on controllable semiconductor switches, typically insulated gate bipolar transistors. Compared to traditional diode bridge-based power converters, line converters have many advantageous characteristics, including bidirectional power flow, a controllable DC-link voltage and power factor, and a sinusoidal line current. This thesis considers the control of the line converter and its application to power quality improvement. The line converter control system studied is based on virtual flux linkage orientation and the direct torque control (DTC) principle. A new DTC-based current control scheme is introduced and analyzed. The overmodulation characteristics of the DTC converter are considered, and an analytical equation for the maximum modulation index is derived. The integration of active filtering features into the line converter is considered. Three different active filtering methods are implemented. A frequency-domain method, which is based on selective harmonic sequence elimination, and a time-domain method, which is effective over a wider frequency band, are used in harmonic current compensation. In addition, a voltage feedback active filtering method, which mitigates harmonic sequences of the grid voltage, is implemented. The frequency-domain and voltage feedback active filtering control systems are analyzed and their controllers are designed. The designs are verified with practical measurements. The performance and characteristics of the implemented active filtering methods are compared, and the effect of the L- and LCL-type line filters is discussed. The importance of a correct grid impedance estimate in the voltage feedback active filter control system is discussed, and a new measurement-based method for obtaining it is proposed. A power conditioning system (PCS) application of the line converter is also considered.
A new method for correcting the voltage unbalance of the PCS-fed island network is proposed and experimentally validated.
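Frequency-domain selective harmonic compensation rests on extracting the complex amplitude of one harmonic from a measured current window; the compensator then injects that component in antiphase. A minimal sketch of the extraction step (a single-bin DFT, illustrative and not the thesis's control implementation; it assumes the window spans an integer number of grid cycles):

```python
import numpy as np

def harmonic_component(samples, f_grid, fs, order):
    """Complex amplitude of the `order`-th harmonic of f_grid in a
    sampled current waveform, via correlation with one DFT bin.

    samples : 1-D array covering an integer number of grid periods
    f_grid  : fundamental grid frequency [Hz]
    fs      : sampling frequency [Hz]
    order   : harmonic order (1 = fundamental, 5 = fifth harmonic, ...)
    """
    n = len(samples)
    t = np.arange(n) / fs
    ref = np.exp(-2j * np.pi * order * f_grid * t)
    return 2.0 / n * np.sum(samples * ref)
```

With the 5th and 7th harmonic amplitudes in hand, a selective scheme targets exactly those sequences, whereas a time-domain method acts on the whole residual spectrum at once.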
Abstract:
Software engineering is criticized as not being engineering or a 'well-developed' science at all. Software engineers do not seem to know exactly how long their projects will last, what they will cost, or whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes; this thesis discusses only effort estimation in software projects for project management purposes. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn. Effort estimation accuracy has improved significantly after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement. The author of this thesis has developed a three-level solution for the estimation model. All currently used size metrics are static in nature, but this new proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design work proceeds, and thus 'grows up' along with the software project. Developing the effort estimation model is not possible without gathering and analyzing history data.
However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used properly, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed. It is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity; an example is shown to shed more light on the calibration and the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
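Tracking estimation accuracy across releases, as described above, needs a concrete metric. Two standard ones in the effort-estimation literature are MMRE (mean magnitude of relative error) and PRED(25) (the share of estimates within 25 % of the actual effort); the thesis does not name its accuracy metric here, so these serve only as common examples:

```python
def mmre(actuals, estimates):
    """Mean magnitude of relative error over paired (actual, estimate) efforts."""
    errors = [abs(a - e) / a for a, e in zip(actuals, estimates)]
    return sum(errors) / len(errors)

def pred(actuals, estimates, level=0.25):
    """PRED(level): fraction of estimates within `level` of the actual value."""
    hits = sum(1 for a, e in zip(actuals, estimates)
               if abs(a - e) / a <= level)
    return hits / len(actuals)
```

Computing these per release makes "estimation accuracy has improved" a checkable claim: MMRE should fall and PRED(25) should rise over successive calibrations.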
Abstract:
General Packet Radio Service (GPRS) enables packet-switched data transfer in the GSM network. It offers a connection to packet data networks while raising the data rate at the radio interface. Radio resources are reserved only when there is something to send, which makes the use of radio resources much more efficient. This master's thesis concentrates on the GPRS protocol and in particular on the Radio Link Control (RLC) layer of its data stack. The RLC layer takes care of the reliability of the connection between the GPRS phone and the base station. The goal of the work is to study the functionality of the RLC layer and its reliability under weak signal conditions, and to determine the effect of a weak signal on retransmissions. The work results in an estimate of the effect of signal strength and retransmissions on the GPRS data rate. The thesis also briefly covers the GSM system, as this makes it easier for the reader to understand the technical changes required by the GPRS system. This master's thesis was carried out as part of an ongoing GPRS product development project at Nokia Mobile Phones Ltd. The results are used to support testing, and they have been used to help plan the reliability testing of the RLC layer.
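The link between retransmissions and data rate can be sketched with a simple first-order model: in RLC acknowledged mode an erroneous block is retransmitted, so with block error rate BLER each block costs on average 1/(1 - BLER) transmissions. This is an illustrative back-of-the-envelope model (ignoring polling and window-stall effects), not the thesis's measurement method:

```python
def expected_transmissions(bler):
    """Expected transmissions per RLC block when every erroneous
    block is retransmitted independently with error rate `bler`."""
    return 1.0 / (1.0 - bler)

def effective_throughput(nominal_bps, bler):
    """Useful throughput after retransmission overhead."""
    return nominal_bps / expected_transmissions(bler)

# e.g. a 13.4 kbit/s per-slot nominal rate with 10 % block errors
rate = effective_throughput(13_400, 0.10)   # ~12.1 kbit/s useful
```

Under this model, throughput degrades linearly in BLER, so the steep drop measured near the sensitivity limit of the receiver comes from BLER itself rising sharply as the signal weakens.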
Abstract:
Construction of multiple sequence alignments is a fundamental task in bioinformatics. Multiple sequence alignments are used as a prerequisite in many bioinformatics methods, and consequently the quality of such methods can depend critically on the quality of the alignment. However, automatic construction of a multiple sequence alignment for a set of remotely related sequences does not always produce biologically relevant alignments. Therefore, there is a need for an objective approach to evaluating the quality of automatically aligned sequences. The profile hidden Markov model is a powerful approach in comparative genomics. In the profile hidden Markov model, the symbol probabilities are estimated at each conserved alignment position. This can increase the dimension of the parameter space and cause an overfitting problem. These two research problems are both related to conservation. We have developed statistical measures for quantifying the conservation of multiple sequence alignments. Two types of methods are considered: those identifying conserved residues in an alignment position, and those calculating positional conservation scores. The positional conservation score was exploited in a statistical prediction model for assessing the quality of multiple sequence alignments. The residue conservation score was used as part of the emission probability estimation method proposed for profile hidden Markov models. The predicted alignment quality scores correlated highly with the true alignment quality scores, indicating that our method is reliable for assessing the quality of any multiple sequence alignment. Comparing the emission probability estimation method with the maximum likelihood method showed that the number of estimated parameters in the model decreased dramatically while the same level of accuracy was maintained.
To conclude, we have shown that conservation can be successfully used in the statistical model for alignment quality assessment and in the estimation of emission probabilities in the profile hidden Markov models.
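A common way to score positional conservation, one of the two method types mentioned above, is via the Shannon entropy of the residue distribution in an alignment column. The sketch below is one illustrative scoring choice, not the thesis's exact measure:

```python
import math
from collections import Counter

def column_conservation(column):
    """Conservation score for one alignment column in [0, 1]:
    1 - (Shannon entropy of observed residues / maximum entropy).
    Gaps ('-') are ignored; an all-gap column scores 0."""
    counts = Counter(r for r in column if r != '-')
    total = sum(counts.values())
    if total == 0:
        return 0.0
    entropy = -sum((c / total) * math.log2(c / total)
                   for c in counts.values())
    max_entropy = math.log2(20)   # 20 amino-acid alphabet
    return 1.0 - entropy / max_entropy
```

A fully conserved column scores 1.0, and the score falls as the column's residue distribution spreads out; a per-alignment quality score can then aggregate the column scores.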
Abstract:
As its primary objective, this thesis examines the service quality of Finnair Technical Procurement and its underlying process. As an internal unit, Technical Procurement serves as a link between external suppliers and internal customers. It is argued that external service quality requires a certain quality level within the organization. At the same time, the aircraft maintenance business is subject to economic constraints. Therefore, a methodology based on a modified House of Quality was developed to assist management in analyzing and evaluating Technical Procurement's service level and the connected process steps. It could be shown that qualitative and quantitative objectives do not per se exclude each other.
Abstract:
Increased emissions of greenhouse gases into the atmosphere are causing an anthropogenic climate change. The resulting global warming challenges the ability of organisms to adapt to the new temperature conditions. However, warming is not the only major threat. In marine environments, dissolution of carbon dioxide from the atmosphere causes a decrease in surface water pH, so-called ocean acidification. The temperature and acidification effects can interact and create even larger problems for marine flora and fauna than either effect would cause alone. I have used Baltic calanoid copepods (crustacean zooplankton) as my research subject and studied their growth and stress responses using climate projections for the next century. I have studied both direct temperature and pH effects on copepods, and indirect effects via their food: the changing phytoplankton spring bloom composition and a toxic cyanobacterium. The main aims of my thesis were: 1) to find out how warming and acidification combined with a toxic cyanobacterium affect copepod reproductive success (egg production, egg viability, egg hatching success, offspring development) and oxidative balance (antioxidant capacity, oxidative damage), and 2) to reveal the possible food quality effects of a spring phytoplankton bloom dominated by diatoms or dinoflagellates on reproducing copepods (egg production, egg hatching, RNA:DNA ratio). The two copepod genera used, Acartia sp. and Eurytemora affinis, are the dominant mesozooplankton taxa (0.2–2 mm) in my study area, the Gulf of Finland. A temperature of 20 °C seems to be within the tolerance limits of Acartia spp., because the copepods can adapt to it phenotypically by adjusting their body size. Copepods are also able to tolerate a pH decrease of 0.4 from present values, but the combination of warm water and decreased pH causes problems for them.
In my studies, copepod oxidative balance was negatively influenced by the interaction of these two environmental factors, and egg and nauplii production was lower at 20 °C and lowered pH than at 20 °C and ambient pH. However, the presence of the toxic cyanobacterium Nodularia spumigena improved the copepods' oxidative balance and helped them resist the environmental stress in question. In addition, adaptive maternal effects seem to be an important adaptation mechanism in a changing environment, but how much a female copepod can invest in her offspring depends on her condition and her diet. I did not find a systematic food quality difference between diatoms and dinoflagellates; there are both good and bad diatom and dinoflagellate species. Instead, the dominant species in the phytoplankton bloom composition has a central role in determining food quality, although copepods aim at obtaining as balanced a diet as possible by foraging on several species. If the dominant species is of poor quality, it can cause stress when ingested, or lead to non-optimal foraging if rejected. My thesis demonstrates that climate-change-induced changes in water temperature and pH can cause problems for Baltic Sea copepod communities. However, their resilience depends substantially on their diet, and therefore on the response of phytoplankton to the environmental changes. As copepods are an important link in pelagic food webs, their future success can have far-reaching consequences, for example for fish stocks.
Abstract:
Since its discovery, chaos has been a very interesting and challenging topic of research, and many great minds have spent their entire lives trying to give some rules to it. Nowadays, thanks to the research of the last century and the advent of computers, it is possible to predict chaotic phenomena of nature for a certain limited amount of time. The aim of this study is to present a recently discovered method for parameter estimation of chaotic dynamical system models via the correlation integral likelihood, to give some hints for a more optimized use of it, and to outline a possible application to industry. The main part of our study concerned two chaotic attractors whose general behaviour differs, in order to capture possible differences in the results. In the various simulations that we performed, the initial conditions were varied quite exhaustively. The results obtained show that, under certain conditions, this method works very well in all the cases. In particular, it turned out that the most important aspect is to be very careful while creating the training set and the empirical likelihood, since a lack of information in this part of the procedure leads to low-quality results.
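The statistic underlying the correlation integral likelihood is the correlation sum C(r): the fraction of point pairs on the (reconstructed) attractor that lie within distance r of each other. A minimal sketch of its computation (a brute-force O(n²) version, illustrative rather than the study's actual pipeline):

```python
import numpy as np

def correlation_sum(points, radius):
    """Correlation sum C(r): fraction of distinct point pairs whose
    Euclidean distance is below `radius`.

    points : (n, d) array of attractor points
    radius : scale r at which to evaluate C(r)
    """
    n = len(points)
    diffs = points[:, None, :] - points[None, :, :]        # (n, n, d)
    dists = np.sqrt((diffs ** 2).sum(axis=-1))             # (n, n)
    pair_dists = dists[np.triu_indices(n, k=1)]            # upper triangle
    return np.count_nonzero(pair_dists < radius) / (n * (n - 1) / 2)
```

Evaluating C(r) over a vector of radii turns a simulated trajectory into a feature vector; comparing such vectors from the training set against those from candidate parameters is what gives the likelihood, which is why the abstract stresses care in building the training set.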