13 results for Packet Filtering

in Helda - Digital Repository of University of Helsinki


Relevance: 20.00%

Abstract:

Stochastic filtering is, in general terms, the estimation of indirectly observed states given observed data: the conditional expected value is, in the sense of probability theory, the most accurate estimate given the observations. In this thesis the theory of filtering is presented for two different kinds of observation process: a diffusion process, discussed in the first chapter, and a counting process, introduced in the third chapter. Most of the fundamental results of stochastic filtering are stated in the form of equations, such as the unnormalized Zakai equation, which leads to the Kushner-Stratonovich equation. The latter, also known as the normalized Zakai equation or the Fujisaki-Kallianpur-Kunita (FKK) equation, shows how the estimate differs between a diffusion and a counting observation process. An example for the linear Gaussian case is also presented, which is essentially the construction behind the so-called Kalman-Bucy filter. Since the unnormalized and normalized Zakai equations are expressed in terms of the conditional distribution, a density for these distributions is derived through these equations and stated in Kushner's theorem. Kushner's theorem, however, takes the form of a stochastic partial differential equation whose solution must be verified to exist and to be unique; this is covered in the second chapter.
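The abstract does not reproduce the equations themselves. For orientation, a standard textbook statement of the two filtering equations, and of the Kalman-Bucy filter for the linear Gaussian case, is sketched below; the notation (generator A, observation function h, unnormalized filter rho, normalized filter pi, unit-covariance noises) is assumed here and is not taken from the thesis.

```latex
% Signal X_t with generator \mathcal{A}, observation dY_t = h(X_t)\,dt + dV_t.
% Unnormalized filter \rho_t and normalized filter \pi_t(\varphi) = E[\varphi(X_t)\mid\mathcal{F}^Y_t].
\mathrm{d}\rho_t(\varphi) = \rho_t(\mathcal{A}\varphi)\,\mathrm{d}t + \rho_t(\varphi h)\,\mathrm{d}Y_t
  \qquad \text{(unnormalized Zakai equation)}

\mathrm{d}\pi_t(\varphi) = \pi_t(\mathcal{A}\varphi)\,\mathrm{d}t
  + \bigl(\pi_t(\varphi h) - \pi_t(\varphi)\,\pi_t(h)\bigr)
    \bigl(\mathrm{d}Y_t - \pi_t(h)\,\mathrm{d}t\bigr)
  \qquad \text{(Kushner-Stratonovich / FKK equation)}

% Linear Gaussian case dX_t = F X_t\,dt + B\,dW_t, dY_t = H X_t\,dt + dV_t (Kalman-Bucy filter):
\mathrm{d}\hat{X}_t = F\hat{X}_t\,\mathrm{d}t + P_t H^{\top}\bigl(\mathrm{d}Y_t - H\hat{X}_t\,\mathrm{d}t\bigr),
\qquad
\dot{P}_t = F P_t + P_t F^{\top} + B B^{\top} - P_t H^{\top} H P_t
```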

Relevance: 10.00%

Abstract:

Autism and Asperger syndrome (AS) are neurodevelopmental disorders characterised by deficient social and communication skills, as well as restricted, repetitive patterns of behaviour. Language development in individuals with autism is significantly delayed and deficient, whereas in individuals with AS the structural aspects of language develop quite normally. Both groups, however, have semantic-pragmatic language deficits. The present thesis investigated auditory processing in individuals with autism and AS. In particular, the discrimination of and orienting to speech and non-speech sounds was studied, as well as the abstraction of invariant sound features from speech-sound input. Altogether five studies were conducted with auditory event-related brain potentials (ERP); two studies also included a behavioural sound-identification task. In three studies the subjects were children with autism, in one study children with AS, and in one study adults with AS. In children with autism, even the early stages of sound encoding were deficient. In addition, these children had altered sound-discrimination processes, characterised by enhanced spectral but deficient temporal discrimination. The enhanced pitch discrimination may partly explain the auditory hypersensitivity common in autism, and it may compromise the filtering of relevant auditory information from irrelevant information. Indeed, it was found that when sound discrimination required abstracting invariant features from varying input, children with autism maintained their superiority in pitch processing but lost it in vowel processing. Finally, involuntary orienting to sound changes was deficient in children with autism, in particular with respect to speech sounds. This finding is in agreement with previous studies on autism suggesting deficits in orienting to socially relevant stimuli. In contrast to children with autism, the early stages of sound encoding were fairly unimpaired in children with AS. However, sound discrimination and orienting were altered rather similarly in these children as in those with autism, suggesting correspondences in the auditory phenotype of these two disorders, which belong to the same continuum. Unlike children with AS, adults with AS showed enhanced processing of duration changes, suggesting developmental changes in auditory processing in this disorder.

Relevance: 10.00%

Abstract:

The quantification and characterisation of soil phosphorus (P) is of agricultural and environmental importance, and different extraction methods are widely used to assess the bioavailability of P and to characterize soil P reserves. However, the large variety of extractants, pre-treatments and sample preparation procedures complicates the comparison of published results. In order to improve our understanding of the behaviour and cycling of P in soil, it is crucial to know the scientific relevance of the methods used for various purposes. Knowledge of the factors affecting the analytical outcome is a prerequisite for justified interpretation of the results. The aim of this thesis was to study the effects of sample preparation procedures on soil P and to determine the dependence of the recovered P pool on the chemical nature of the extractants. Sampling is a critical step in soil testing, and the sampling strategy depends on the land-use history and the purpose of sampling. This study revealed that pre-treatments changed soil properties: air-drying was found to affect soil P, particularly extractable organic P, by disrupting organic matter. This was evidenced by an increase in the water-extractable small-sized (<0.2 µm) P that, at least partly, took place at the expense of the large-sized (>0.2 µm) P. Freezing, however, induced only insignificant changes, and thus freezing can be taken to be a suitable storage method for soils from the boreal zone that naturally undergo periodic freezing. The results demonstrated that the chemical nature of the extractant affects its sensitivity in detecting changes in soil P solubility. Buffered extractants obscured the alterations in P solubility induced by pH changes; however, water extraction, though sensitive to physicochemical changes, can be used to reveal short-term changes in soil P solubility. As for organic P, the analysis was found to be sensitive to the sample preparation procedures: filtering may leave a large proportion of extractable organic P undetected, whereas the outcome of centrifugation was found to be affected by the ionic strength of the extractant. Widely used sequential fractionation procedures proved able to detect land-use-derived differences in the distribution of P among fractions of different solubilities. However, interpretation of the results from extraction experiments requires a better understanding of the biogeochemical function of the recovered P fraction in the P cycle in differently managed soils under dissimilar climatic conditions.

Relevance: 10.00%

Abstract:

Wireless technologies are continuously evolving. Second-generation cellular networks have gained worldwide acceptance. Wireless LANs are commonly deployed in corporations and university campuses, and their diffusion in public hotspots is growing. Third-generation cellular systems have yet to take hold everywhere; still, an impressive amount of research is ongoing for deploying beyond-3G systems. These new wireless technologies combine the characteristics of WLAN-based and cellular networks to provide increased bandwidth. The common direction in which all these efforts in wireless technologies are headed is towards IP-based communication. Telephony services have been the killer application for cellular systems; their evolution to packet-switched networks is a natural path. Effective IP telephony signaling protocols, such as the Session Initiation Protocol (SIP) and the H.323 protocol, are needed to establish IP-based telephony sessions. However, IP telephony is just one example of an IP-based communication service. IP-based multimedia sessions are expected to become popular and to offer a wider range of communication capabilities than pure telephony. In order to conjoin the advances of future wireless technologies with the potential of IP-based multimedia communication, the next step is to obtain ubiquitous communication capabilities. According to this vision, people must be able to communicate even when no support from an infrastructure network is available, needed or desired. In order to achieve ubiquitous communication, end devices must integrate all the capabilities necessary for IP-based distributed and decentralized communication. Such capabilities are currently missing; for example, it is not possible to utilize native IP telephony signaling protocols in a totally decentralized way. This dissertation presents a solution for deploying the SIP protocol in a decentralized fashion without support from infrastructure servers. The proposed solution is mainly designed to fit the needs of decentralized mobile environments and can be applied to small-scale ad-hoc networks as well as larger networks with hundreds of nodes. A framework allowing discovery of SIP users in ad-hoc networks and the establishment of SIP sessions among them, in a fully distributed and secure way, is described and evaluated. The security support allows ad-hoc users to authenticate the sender of a message and to verify the integrity of a received message. The distributed session management framework has been extended in order to achieve interoperability with the Internet and with native Internet applications. With limited extensions to the SIP protocol, we have designed and experimentally validated a SIP gateway allowing SIP signaling between ad-hoc networks with a private addressing space and native SIP applications in the Internet. The design is completed by an application-level relay that permits instant messaging sessions to be established in heterogeneous environments. The resulting framework constitutes a flexible and effective approach for the pervasive deployment of real-time applications.
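The abstract does not detail the discovery mechanism of the framework. Purely as an illustration of the general idea of serverless SIP user discovery, the hypothetical sketch below multicasts a minimal SIP-like OPTIONS request on a link-local group and collects replies; the group address, port, message fields and the overall approach are assumptions for illustration, not the dissertation's design.

```python
import socket

# Hypothetical link-local multicast group and port for peer discovery
# (illustrative assumption, not taken from the dissertation).
GROUP, PORT = "224.0.1.75", 5060

def build_discovery_request(user: str, host: str) -> bytes:
    """Build a minimal SIP-like OPTIONS request announcing/locating a user."""
    return (
        f"OPTIONS sip:{user}@{GROUP}:{PORT} SIP/2.0\r\n"
        f"Via: SIP/2.0/UDP {host}:{PORT}\r\n"
        f"From: <sip:{user}@{host}>\r\n"
        f"To: <sip:{user}@{GROUP}>\r\n"
        f"Call-ID: discovery-{user}@{host}\r\n"
        f"CSeq: 1 OPTIONS\r\n"
        f"Content-Length: 0\r\n\r\n"
    ).encode()

def discover(user: str, host: str, timeout: float = 2.0) -> list[str]:
    """Multicast a discovery request and collect the addresses of responding peers."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)  # stay link-local
    sock.settimeout(timeout)
    sock.sendto(build_discovery_request(user, host), (GROUP, PORT))
    peers = []
    try:
        while True:
            data, addr = sock.recvfrom(4096)
            if data.startswith(b"SIP/2.0 200"):  # a peer answered our OPTIONS
                peers.append(addr[0])
    except socket.timeout:
        pass
    return peers
```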

Relevance: 10.00%

Abstract:

The TCP protocol is used by most Internet applications today, including recent mobile wireless terminals that use TCP for their World-Wide Web, e-mail and other traffic. Recent wireless network technologies, such as GPRS, are known to cause delay spikes in packet transfer, which trigger unnecessary TCP retransmission timeouts. This dissertation proposes a mechanism, Forward RTO-Recovery (F-RTO), for detecting unnecessary TCP retransmission timeouts and thus allowing TCP to take appropriate follow-up actions. We analyze a Linux F-RTO implementation in various network scenarios and investigate different alternatives to the basic algorithm. The second part of this dissertation focuses on quickly adapting TCP's transmission rate when the underlying link characteristics change suddenly. This can happen, for example, due to vertical hand-offs between GPRS and WLAN wireless technologies. We investigate the Quick-Start algorithm that, in collaboration with the network routers, aims to quickly probe the available bandwidth on a network path and to allow TCP's congestion control algorithms to use that information. By extensive simulations we study the different router algorithms and parameters for Quick-Start and discuss the challenges Quick-Start faces in the current Internet. We also study the performance of Quick-Start when applied to vertical hand-offs between different wireless link technologies.
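To make the spurious-timeout detection concrete, the sketch below follows the basic published F-RTO idea: after an RTO, retransmit only the first unacknowledged segment, then send new data instead of further retransmissions, and declare the timeout spurious if the following ACKs advance the window. State names and the congestion-control response are illustrative assumptions, not the thesis's Linux implementation.

```python
# Simplified sketch of basic F-RTO spurious-timeout detection (illustrative only).

class FRTOSender:
    IDLE, AWAIT_FIRST_ACK, AWAIT_SECOND_ACK = range(3)

    def __init__(self):
        self.state = self.IDLE
        self.snd_una = 0          # oldest unacknowledged sequence number

    def on_rto(self, retransmit_first_segment):
        """RTO fired: retransmit only the first unacked segment, then watch the ACKs."""
        retransmit_first_segment()
        self.state = self.AWAIT_FIRST_ACK

    def on_ack(self, ack, send_new_data, enter_conventional_recovery, undo_congestion_response):
        advances = ack > self.snd_una
        if advances:
            self.snd_una = ack

        if self.state == self.AWAIT_FIRST_ACK:
            if advances:
                send_new_data(2)              # send previously unsent data, not retransmissions
                self.state = self.AWAIT_SECOND_ACK
            else:                             # duplicate ACK: segments really were lost
                enter_conventional_recovery()
                self.state = self.IDLE
        elif self.state == self.AWAIT_SECOND_ACK:
            if advances:
                # New data was acknowledged without further retransmissions:
                # the timeout was spurious, so the congestion response can be reverted.
                undo_congestion_response()
            else:
                enter_conventional_recovery()
            self.state = self.IDLE
```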

Relevance: 10.00%

Abstract:

Wireless access is expected to play a crucial role in the future of the Internet. The demands of the wireless environment are not always compatible with the assumptions that were made in the era of wired links. At the same time, new services that take advantage of advances in many areas of technology are being invented. These services include the delivery of mass media such as television and radio, Internet phone calls, and video conferencing. The network must be able to deliver these services to the end user with acceptable performance and quality. This thesis presents an experimental study measuring the performance of bulk-data TCP transfers, streaming audio flows, and HTTP transfers that compete for the limited bandwidth of a GPRS/UMTS-like wireless link. The wireless link characteristics are modeled with a wireless network emulator. We analyze how different competing workload types behave with regular TCP and how active queue management, Differentiated Services (DiffServ), and a combination of TCP enhancements affect the performance and the quality of service. We test four link types, including an error-free link and links with different Automatic Repeat reQuest (ARQ) persistency. The analysis consists of comparing the resulting performance in different configurations based on defined metrics. We observed that DiffServ and Random Early Detection (RED) with Explicit Congestion Notification (ECN) are useful, and in some conditions necessary, for quality of service and fairness, because long queuing delays and congestion-related packet losses cause problems without them. However, we observed situations where there is still room for significant improvement if the link level is aware of the quality of service. Only a very error-prone link diminishes the benefits to nil. The combination of TCP enhancements, consisting of an initial window of four, Control Block Interdependence (CBI) and Forward RTO recovery (F-RTO), improves performance. The initial window of four helps a later-starting TCP flow to start faster but generates congestion under some conditions. CBI prevents slow-start overshoot and balances slow start in the presence of error drops, and F-RTO successfully reduces unnecessary retransmissions.
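As background for the RED-with-ECN configuration discussed above, the sketch below shows the core RED mechanism: an exponentially weighted average queue length and a mark/drop probability that grows linearly between two thresholds, with ECN marking used instead of dropping when the packet supports it. It is a simplified textbook form (it omits the count-based probability adjustment and idle-time handling), and the parameter values are illustrative assumptions, not those used in the thesis experiments.

```python
import random

class Red:
    """Minimal RED average-queue computation with an ECN-aware mark/drop decision."""

    def __init__(self, min_th=5, max_th=15, max_p=0.1, w_q=0.002):
        self.min_th, self.max_th, self.max_p, self.w_q = min_th, max_th, max_p, w_q
        self.avg = 0.0   # exponentially weighted moving average of the queue length

    def on_enqueue(self, queue_len: int, ecn_capable: bool) -> str:
        """Return 'enqueue', 'mark' (set ECN CE), or 'drop' for the arriving packet."""
        self.avg = (1 - self.w_q) * self.avg + self.w_q * queue_len
        if self.avg < self.min_th:
            return "enqueue"
        if self.avg >= self.max_th:
            return "mark" if ecn_capable else "drop"
        # Between the thresholds the mark/drop probability grows linearly up to max_p.
        p = self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)
        if random.random() < p:
            return "mark" if ecn_capable else "drop"
        return "enqueue"
```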

Relevance: 10.00%

Abstract:

We investigate methods for recommending multimedia items suitable for an online multimedia sharing community and introduce a novel algorithm called UserRank for ranking multimedia items based on link analysis. We also take the initiative of applying EigenRumor from the domain of the blogosphere to multimedia. Furthermore, we present a strategy for making personalized recommendations that combines UserRank with collaborative filtering. We evaluate our method with an informal user study and show that the results obtained are promising.
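The abstract does not spell out UserRank itself. Purely as an illustration of combining link-analysis ranking with collaborative filtering, the sketch below runs a PageRank-style power iteration over an item "endorsement" graph and linearly blends the result with a simple per-item mean-rating score standing in for the collaborative-filtering component; the graph construction, damping factor and blending weight are assumptions, not the paper's algorithm.

```python
from collections import defaultdict

def link_rank(edges, damping=0.85, iters=50):
    """PageRank-style scores for items from directed 'endorsement' edges (src -> dst)."""
    nodes = {n for e in edges for n in e}
    out = defaultdict(list)
    for src, dst in edges:
        out[src].append(dst)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        nxt = {n: (1 - damping) / len(nodes) for n in nodes}
        for src in nodes:
            targets = out[src] or list(nodes)   # dangling nodes spread their mass evenly
            share = damping * rank[src] / len(targets)
            for dst in targets:
                nxt[dst] += share
        rank = nxt
    return rank

def recommend(user, ratings, edges, alpha=0.5):
    """Blend link-analysis rank with a simple item-mean rating score for unseen items.

    ratings: {item: {user: rating}}; alpha weights the link-analysis component.
    """
    rank = link_rank(edges)
    cf = {item: sum(r.values()) / len(r) for item, r in ratings.items()}
    seen = {item for item, r in ratings.items() if user in r}
    scores = {item: alpha * rank.get(item, 0) + (1 - alpha) * cf.get(item, 0)
              for item in rank if item not in seen}
    return sorted(scores, key=scores.get, reverse=True)
```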

Relevance: 10.00%

Abstract:

With the proliferation of wireless and mobile devices equipped with multiple radio interfaces to connect to the Internet, vertical handoff involving different wireless access technologies will enable users to get the best connectivity and service quality during the lifetime of a TCP connection. A vertical handoff may introduce an abrupt, significant change in the access link characteristics, and as a result the end-to-end path characteristics, such as the bandwidth and the round-trip time (RTT) of a TCP connection, may change considerably. TCP may take several RTTs to adapt to these changes in path characteristics, and during this interval there may be packet losses and/or inefficient utilization of the available bandwidth. In this thesis we study the behaviour and performance of TCP in the presence of a vertical handoff. We identify the different handoff scenarios that adversely affect TCP performance. We propose several enhancements to the TCP sender algorithm, specific to the different handoff scenarios, that adapt TCP better to a vertical handoff. Our algorithms are conservative in nature and make use of cross-layer information obtained from the lower layers regarding the characteristics of the access links involved in a handoff. We evaluate the proposed algorithms by extensive simulation of the various handoff scenarios involving access links with a wide range of bandwidths and delays. We show that the proposed algorithms are effective in improving TCP behaviour in various handoff scenarios and do not adversely affect the performance of TCP in the absence of cross-layer information.
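The abstract does not give the sender algorithms themselves. As one illustration of the kind of conservative, cross-layer-informed adaptation described, the sketch below re-seeds the congestion-control state from the new access link's advertised bandwidth and delay after a handoff notification; the attribute names and the exact policy are assumptions for illustration, not the thesis's algorithms.

```python
def on_vertical_handoff(tcp, new_link_bw_bps, new_link_rtt_s, mss_bytes=1460):
    """Conservatively re-seed TCP congestion state from cross-layer link hints.

    Illustrative only: `tcp` is a placeholder object with cwnd/ssthresh/srtt/rto
    attributes (in segments and seconds). The bandwidth-delay product of the new
    access link caps ssthresh, and cwnd restarts small so the connection probes
    upward instead of flooding a slower link or crawling on a faster one.
    """
    bdp_segments = max(1, int(new_link_bw_bps * new_link_rtt_s / (8 * mss_bytes)))
    tcp.ssthresh = bdp_segments                           # drop the old link's estimate
    tcp.cwnd = min(tcp.cwnd, max(2, bdp_segments // 4))   # restart well below the new BDP
    tcp.srtt = new_link_rtt_s                             # reset RTT estimate to the hint
    tcp.rto = max(1.0, 4 * new_link_rtt_s)                # avoid spurious timeouts on a slow link
```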

Relevance: 10.00%

Abstract:

Atrial fibrillation (AF) is the most common tachyarrhythmia and is associated with substantial morbidity, increased mortality and cost. The treatment modalities of AF have increased, but results are still far from optimal. More individualized therapy may be beneficial, and aiming for this calls for improved diagnostics. The aim of this study was to find non-invasive parameters, obtained during sinus rhythm, reflecting electrophysiological patterns related to the propensity to AF, and particularly to AF occurring without any associated heart disease, lone AF. Overall, 240 subjects were enrolled: 136 patients with paroxysmal lone AF and 104 controls (mean age 45 years, 75% males). Signal measurements were performed by non-invasive magnetocardiography (MCG) and by invasive electroanatomic mapping (EAM). High-pass filtering techniques and a new method based on a surface gradient technique were adapted to analyze the atrial MCG signal. The EAM was used to elucidate atrial activation in patients and as a reference for MCG. The results showed that MCG mapping is an accurate method for detecting atrial electrophysiologic properties. In lone paroxysmal AF, the duration of the atrial depolarization complex was marginally prolonged. The difference was more obvious in women and was also related to interatrial conduction patterns. In the focal type of AF (75%), the root mean square (RMS) amplitudes of the atrial signal were normal, but in AF without demonstrable triggers the late atrial RMS amplitudes were reduced. In addition, the atrial characteristics tended to remain similar even when examined several years after the first AF episodes. The intra-atrial recordings confirmed the occurrence of three distinct sites of electrical connection from the right to the left atrium (LA): the Bachmann bundle (BB), the margin of the fossa ovalis (FO), and the coronary sinus ostial area (CS). The propagation of the atrial signal could also be evaluated non-invasively. Three MCG atrial wave types were identified, each of which represented a distinct interatrial activation pattern. In conclusion, in paroxysmal lone AF, active focal triggers are common, atrial depolarization is slightly prolonged but with a normal amplitude, and the arrhythmia does not necessarily lead to electrical or mechanical dysfunction of the atria. In women the prolongation of atrial depolarization is more obvious, which may be related to gender differences in the presentation of AF. A significant minority of patients with lone AF lack frequent focal triggers, and in them the late atrial signal amplitude is reduced, possibly signifying a wider degenerative process in the LA. In lone AF, natural impulse propagation to the LA during sinus rhythm goes through one or more of the principal pathways described. The BB is the most common route, but in one-third of patients the earliest LA activation occurs outside the BB. Susceptibility to paroxysmal lone AF is associated with propagation of the atrial signal via the margin of the FO or via multiple pathways. When conduction occurs via the BB, it is associated with prolonged atrial activation. Thus, altered and alternative conduction pathways may contribute to the pathogenesis of lone AF. There is growing evidence of variability in the genesis of AF also within lone paroxysmal AF, and the present study suggests that this variation may be reflected in the cardiac signal pattern. Recognizing the distinct signal profiles may assist in understanding the pathogenesis of AF and in identifying subgroups for patient-tailored therapy.

Relevance: 10.00%

Abstract:

The magnetic field of the Earth is 99% of internal origin and is generated in the outer liquid core by the dynamo principle. In the 19th century, Carl Friedrich Gauss proved that the field can be described by a sum of spherical harmonic terms. Presently, this theory is the basis of, for example, the IGRF models (International Geomagnetic Reference Field), which are the most accurate description available of the geomagnetic field. On average, the dipole forms 3/4 and the non-dipolar terms 1/4 of the instantaneous field, but the temporal mean of the field is assumed to be a pure geocentric axial dipole field. The validity of this GAD (Geocentric Axial Dipole) hypothesis has been estimated using several methods. In this work, the testing rests on the frequency distribution of inclination with respect to latitude. Each combination of dipole (GAD), quadrupole (G2) and octupole (G3) produces a distinct inclination distribution. These theoretical distributions have been compared with those calculated from empirical observations from different continents and, finally, from the entire globe. Only data from Precambrian rocks (over 542 million years old) have been used in this work. The basic assumption is that during the long-term course of drifting continents, the globe is sampled adequately. There were 2823 observations altogether in the paleomagnetic database of the University of Helsinki. The effects of the quality of the observations, as well as of the age and rock type, have been tested. For the comparison between theoretical and empirical distributions, chi-square testing has been applied. In addition, spatiotemporal binning has been used effectively to remove the errors caused by multiple observations. The modelling from igneous rock data shows that the average magnetic field of the Earth is best described by a combination of a geocentric dipole and a very weak octupole (less than 10% of GAD). Filtering and binning gave the distributions a more GAD-like appearance, but the deviation from GAD increased as a function of the age of the rocks. The distribution calculated from the so-called keypoles, the most reliable determinations, behaves almost like GAD, having a zero quadrupole and an octupole of 1% of GAD. In no earlier study have past-400-Ma rocks given a result so close to GAD, but low inclinations have been prominent especially in the sedimentary data. Despite these results, a greater amount of high-quality data and a proof of the long-term randomness of the Earth's continental motions are needed to make sure the dipole model holds true.
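For reference, the dipole relation underlying the inclination test is the standard textbook one (stated here in general form, not quoted from the thesis): for a geocentric axial dipole the magnetic inclination I observed at geographic latitude lambda satisfies

```latex
\tan I = 2 \tan \lambda
```

Non-zero quadrupole (G2) and octupole (G3) contributions shift the inclinations away from this relation, which is why each combination of GAD, G2 and G3 yields its own inclination frequency distribution when integrated over latitude.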

Relevance: 10.00%

Abstract:

A method used for screening drugs of abuse must be sensitive, selective, simple, fast and reproducible. The aim of this work was to develop a simple yet sensitive sample-pretreatment method for the qualitative screening of benzodiazepines and amphetamine derivatives in urine using a micropillar electrospray ionization chip (μPESI), which would offer an alternative to the immunological screening methods, whose sensitivity and selectivity are inadequate. At the same time, the aim was to examine how well the micropillar electrospray chip performs in the analysis of biological samples. The pretreatment was optimized separately for benzodiazepines and amphetamine derivatives. The pretreatment methods used were liquid-liquid extraction, solid-phase extraction with an Oasis HLB cartridge and with a ZipTip® pipette tip, and dilution and filtration without extraction. Based on the measurements, the optimization focused on the ZipTip® extraction. For the optimization, the analytes were spiked into blank urine at their predefined cut-off concentrations, benzodiazepines at 200 ng/ml and amphetamine derivatives at 300 ng/ml. For the benzodiazepines, each extraction step was optimized; as a result, the sample pH was adjusted to 5, the phase was conditioned with acetonitrile, equilibrated and washed with a mixture of water (pH 5) and acetonitrile (10% v/v), and eluted with a mixture of acetonitrile, formic acid and water (95:1:4 v/v/v). In the extraction of the amphetamine derivatives, the pH values of the sample and the solvents were optimized; as a result, the sample pH was adjusted to 10, the phase was conditioned with a mixture of water and ammonium hydrogen carbonate (pH 10, 1:1 v/v), equilibrated and washed with a mixture of acetonitrile and water (1:5 v/v), and eluted with methanol. The optimized extractions were tested with authentic urine samples supplied by Yhtyneet Medix Laboratoriot, and the results obtained were compared with those of quantitative GC/MS analysis. The benzodiazepine samples were hydrolyzed before extraction to improve sensitivity. The authentic samples were analyzed with a Q-TOF instrument in Viikki. In addition, the hydrolyzed benzodiazepine samples were measured with the TOF instrument of Yhtyneet Medix Laboratoriot. Based on the results, the developed method requires further optimization to work reliably. A particular problem was the scatter of results between replicates; the manual sample introduction should be made more reproducible. In the analysis of the authentic benzodiazepine samples the problem was false negative results, and in the analysis of the amphetamine derivatives false positive results. The false negatives are explained by the method's lack of sensitivity, and the false positives by contamination of the instrument, the chips or the solvents.

Relevance: 10.00%

Abstract:

The new paradigm of connectedness and empowerment brought by the interactivity of Web 2.0 has been challenging the traditional centralized performance of mainstream media. The corporation has been able to survive the strong winds by transforming itself into a global multimedia business network embedded in the network society. By establishing networks, e.g. networks of production and distribution, the global multimedia business network has been able to sight potential solutions by opening the doors to innovation in a decentralized and flexible manner. Under this emerging context of re-organization, traditional practices like sourcing need to be re-explained, and that is precisely what this thesis attempts to tackle. Based on ICT and on the network society, the study seeks to explain, within the Finnish context, the particular case of Helsingin Sanomat (HS) and its relations with the youth news agency Youth Voice Editorial Board (NÄT). In that sense, the study can be regarded as an explanatory embedded single case study, where HS is the principal unit of analysis and NÄT its embedded unit of analysis. The thesis reached its explanations through interrelated steps. First, it determined the role of ICT in HS's sourcing practices. Then it mapped an overview of HS's sourcing relations and provided a context in which NÄT was located. Finally, it established conceptualized institutional relational data between HS and NÄT for their posterior measurement through social network analysis. The data set was collected via qualitative interviews with online and offline editors of HS as well as interviews with NÄT's personnel. The study concluded that ICT's interactivity and User Generated Content (UGC) are not sourcing tools as such but mechanisms used by HS for getting ideas that could turn into potential news stories. However, when it comes to visual communication, some exceptions were found: the lack of official sources amidst the immediacy leads HS to rely on ICT's interaction and UGC. More than meets the eye, ICT's input into the sourcing practice may be more noticeable if the interaction and UGC are well organized and coordinated into proper and innovative networks of alternative content collaboration. Currently, HS performs this sourcing practice via two projects that differ precisely in the way they are coordinated. The first project found, Omakaupunki, is coordinated internally by the media houses owned by the Sanoma Group: HS, Vartti and Metro. The second project found is coordinated externally. The external alternative sourcing network, as it was labelled, consists of three actors, namely HS, NÄT (the professionals in charge) and the youth. This network is a balanced and complete triad in which the actors connect themselves in relations of feedback, recognition, creativity and filtering. However, as innovation is approached very reluctantly, this content collaboration is a laboratory of experiments; a 'COLLABORATORY'.