19 results for Interfaces sonoras

in Helda - Digital Repository of University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

Reuse of existing carefully designed and tested software improves the quality of new software systems and reduces their development costs. Object-oriented frameworks provide an established means for software reuse on the levels of both architectural design and concrete implementation. Unfortunately, due to frameworks' complexity, which typically results from their flexibility and overall abstract nature, there are severe problems in using frameworks. Patterns are generally accepted as a convenient way of documenting frameworks and their reuse interfaces. In this thesis it is argued, however, that mere static documentation is not enough to solve the problems related to framework usage. Instead, proper interactive assistance tools are needed in order to enable systematic framework-based software production. This thesis shows how patterns that document a framework's reuse interface can be represented as dependency graphs, and how dynamic lists of programming tasks can be generated from those graphs to assist the process of using a framework to build an application. This approach to framework specialization combines the ideas of framework cookbooks and task-oriented user interfaces. Tasks provide assistance in (1) creating new code that complies with the framework reuse interface specification, (2) assuring the consistency between existing code and the specification, and (3) adjusting existing code to meet the terms of the specification. Besides illustrating how task-orientation can be applied in the context of using frameworks, this thesis describes a systematic methodology for modeling any framework reuse interface in terms of software patterns based on dependency graphs. The methodology shows how framework-specific reuse interface specifications can be derived from a library of existing reusable pattern hierarchies. Since the methodology focuses on reusing patterns, it also alleviates the recognized problem of framework reuse interface specifications becoming complicated and unmanageable for frameworks of realistic size. The ideas and methods proposed in this thesis have been tested through implementing a framework specialization tool called JavaFrames. JavaFrames uses role-based patterns that specify a reuse interface of a framework to guide framework specialization in a task-oriented manner. This thesis reports the results of case studies in which JavaFrames and the hierarchical framework reuse interface modeling methodology were applied to the Struts web application framework and the JHotDraw drawing editor framework.
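
To make the dependency-graph idea concrete, the sketch below derives a task list from a small graph of pattern roles. This is only an illustration of the mechanism, not the JavaFrames implementation; the roles, loosely echoing a JHotDraw-style figure pattern, are hypothetical.

# A minimal sketch: programming tasks generated from a pattern dependency
# graph. The graph below is hypothetical; JavaFrames itself is not reproduced.
from graphlib import TopologicalSorter

# Each role of the pattern lists the roles that must be bound before it.
pattern = {
    "subclass Figure": [],
    "implement draw()": ["subclass Figure"],
    "implement handles()": ["subclass Figure"],
    "register figure with editor": ["implement draw()", "implement handles()"],
}

def generate_task_list(graph):
    """Order the roles so every task appears after its prerequisites."""
    return list(TopologicalSorter(graph).static_order())

for task in generate_task_list(pattern):
    print("TODO:", task)

A real tool would regenerate this list interactively as the user binds roles, checking existing code against the specification as well as prompting for new code.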

Relevance:

20.00%

Publisher:

Abstract:

Hydrophobins are a group of particularly surface active proteins. The surface activity is demonstrated in the ready adsorption of hydrophobins to hydrophobic/hydrophilic interfaces such as the air/water interface. Adsorbed hydrophobins self-assemble into ordered films, lower the surface tension of water, and stabilize air bubbles and foams. Hydrophobin proteins originate from filamentous fungi. In the fungi the adsorbed hydrophobin films enable the growth of fungal aerial structures, form protective coatings and mediate the attachment of fungi to solid surfaces. This thesis focuses on hydrophobins HFBI, HFBII, and HFBIII from the rot fungus Trichoderma reesei. The self-assembled hydrophobin films were studied both at the air/water interface and on a solid substrate. In particular, using grazing-incidence x-ray diffraction and reflectivity, it was possible to characterize the hydrophobin films directly at the air/water interface. The in situ experiments yielded information on the arrangement of the protein molecules in the films. All the T. reesei hydrophobins were shown to self-assemble into highly crystalline, hexagonally ordered rafts. The thicknesses of these two-dimensional protein crystals were below 30 Å. Similar films were also obtained on silicon substrates. The adsorption of the proteins is likely to be driven by the hydrophobic effect, but the self-assembly into ordered films also involves specific protein-protein interactions. The protein-protein interactions lead to differences in the arrangement of the molecules in the HFBI, HFBII, and HFBIII protein films, as seen in the grazing-incidence x-ray diffraction data. The protein-protein interactions were further probed in solution using small-angle x-ray scattering. Both HFBI and HFBII were shown to form mainly tetramers in aqueous solution. By modifying the solution conditions and thereby the interactions, it was shown that the association was due to the hydrophobic effect. The stable tetrameric assemblies could tolerate heating and changes in pH. The stability of the structure facilitates the persistence of these secreted proteins in the soil.

Relevance:

10.00%

Publisher:

Abstract:

Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory. Nevertheless, in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking. In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and by the limited capacity of STWM emerge. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, like temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. In another study it was demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized, because access is too slow.
These findings imply a change in thinking about the design of interfaces. Several novel design principles are presented, based on the idea of supporting the deployment of LTWM in the main task.

Relevance:

10.00%

Publisher:

Abstract:

In visual search one tries to find the currently relevant item among other, irrelevant items. The present study investigated visual search performance for complex objects (characters, faces, computer icons and words) and the contributions of different stimulus properties, such as luminance contrast between characters and background, set size, stimulus size, colour contrast, spatial frequency, and stimulus layout. Subjects were required to search for a target object among distracter objects in two-dimensional stimulus arrays. The outcome measure was threshold search time, that is, the presentation duration of the stimulus array required by the subject to find the target with a certain probability. It reflects the time used for visual processing, separated from the time used for decision making and manual reactions. The duration of stimulus presentation was controlled by an adaptive staircase method. The number and duration of eye fixations, saccade amplitude, and perceptual span, i.e., the number of items that can be processed during a single fixation, were measured. Search performance was found to be correlated with the number of fixations needed to find the target. Search time and the number of fixations increased with increasing stimulus set size. On the other hand, several complex objects could be processed during a single fixation, i.e., within the perceptual span. Search time and the number of fixations depended on object type as well as luminance contrast. The size of the perceptual span was smaller for more complex objects, and decreased with decreasing luminance contrast within object type, especially for very low contrasts. In addition, the size and shape of the perceptual span explained the changes in search performance for different stimulus layouts in word search. The perceptual span was scale invariant over a 16-fold range of stimulus sizes, i.e., the number of items processed during a single fixation was independent of retinal stimulus size or viewing distance. It is suggested that saccadic visual search consists of both serial (eye movements) and parallel (processing within the perceptual span) components, and that the size of the perceptual span may explain the effectiveness of saccadic search in different stimulus conditions. Further, low-level visual factors, such as the anatomical structure of the retina, peripheral stimulus visibility and the resolution requirements for identifying different object types, are proposed to constrain the size of the perceptual span and thus limit visual search performance. Similar methods were used in a clinical study to characterise the visual search performance and eye movements of neurological patients with chronic solvent-induced encephalopathy (CSE). In addition, the data on the effects of different stimulus properties on visual search in normal subjects were condensed into simple practical guidelines, so that the limits of human visual perception can be taken into account in the design of user interfaces.
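
The abstract specifies an adaptive staircase without giving its rules, so the sketch below shows one generic variant (a 1-up/2-down staircase, which converges near 70.7% correct) purely to illustrate how presentation duration can be adapted trial by trial; all parameters are invented.

# A generic 1-up/2-down staircase controlling stimulus presentation duration.
# The study's actual staircase rules and parameters are not given in the
# abstract; everything below is illustrative only.
import random

def staircase(respond, initial_ms=500.0, step=0.8, trials=60):
    duration = initial_ms
    correct_run = 0
    for _ in range(trials):
        if respond(duration):        # one search trial at this duration
            correct_run += 1
            if correct_run == 2:     # two correct in a row -> shorter (harder)
                duration *= step
                correct_run = 0
        else:                        # one error -> longer (easier)
            duration /= step
            correct_run = 0
    return duration                  # estimate of the threshold search time

# Simulated observer whose probability of success grows with duration:
estimate = staircase(lambda d: random.random() < min(0.95, d / 250.0))
print(f"threshold search time ~ {estimate:.0f} ms")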

Relevance:

10.00%

Publisher:

Abstract:

Structuring of the Curriculum Design: Content and Pedagogy Constructing the Whole

The object of this qualitative study is to structure curriculum design by drawing on the characteristics of subject content and pedagogy. The aim is first to outline the forms of content and pedagogy within the National Core Curriculum for Basic Education. By analysing these forms I then aim to construct a general view of the curriculum's structure and its developmental potential as it relates to both current and future pedagogical and intellectual interests. The written curriculum is examined as part of the educational guidance system, which means that it is an administrative and juridical document that governs teacher action and has a pedagogical and intellectual character. Didactical schools, curriculum ideologies and curriculum determinants are all discussed as means of approaching the curriculum model. Curriculum content itself is defined by the different forms and conceptions of knowledge. The representation of curriculum content can be defined as either specific or integrated. Curriculum pedagogy is in turn defined on the basis of the prevailing conception of learning and teaching. The pedagogy within the curriculum can be open or closed depending on the extent of pedagogical freedom allowed. An examination of the pedagogical dimension also covers the interfaces between formal education and informal learning, which must be taken into consideration when developing school pedagogy and therefore also the curriculum. The data of the study consist of two curriculum documents: the Finnish National Core Curriculum for Basic Education issued in 1994 and the present National Core Curriculum for Basic Education issued in 2004. The primary method used in the study is theory-based content analysis. On the one hand, the aim of the analysis is to determine whether the structure, i.e., model, of the curricula is built from unconnected, self-contained elements, or whether the separate parts make a coherent whole. On the other hand, the aim is also to examine the pedagogical features the two curricula contain. The basis of the study is not a systematic comparison of the curriculum documents, yet an analysis of two very distinct documents must also be based on an examination of their inherent differences. The results of the study show that the content in the analysed documents is not integrated. The boundaries between divisions are clearly defined, and the curricula are subject-oriented and based on theoretical propositional knowledge. The pedagogy is mainly closed and based on strong guidance of content, structured student evaluation and measurable learning experiences. However, the curriculum documents do contain representations of integrated content: the themes covered early on in the core curriculum guidelines of 1994 systematically permeate the different divisions of the curriculum. The core curriculum guidelines of 2004 in turn present skills which create connections between subjects. The guidelines utilise out-of-school environments, accommodate learner experiences, focus on flexible studying and emphasize individual learner needs. These characteristics reveal an open form of pedagogy. In light of these results, it is possible to reach an understanding of the content-related and pedagogical development possibilities of the curriculum. The essential viewpoints are the setting of thematically oriented aims as a basis for content development, the pedagogical structuring of the curriculum on the basis of the learning process, and the purposeful enhancement of connections between curricular content and pedagogy.

Keywords: curriculum, curriculum theory, curriculum design, core curriculum guidelines, teaching content, pedagogy

Relevance:

10.00%

Publisher:

Abstract:

My doctoral thesis examines the roles of microbes and various chemicals in the formation of deposits and biofilms on paper and board machines. In this work, a "deposit" means an accumulation of solid matter on machine surfaces or at interfaces in the machine circuits intended for transporting pulp stock, sludges, waters or air. A deposit becomes a "biofilm" when its essential structural component is microbial cells or their products. The working hypothesis of my thesis was that knowledge of (i) the composition of deposits and (ii) their structure and their biological, physico-chemical and technical properties guides the researcher towards environmentally sound means of preventing the formation of unwanted deposits or of removing deposits that have already formed. To determine the composition and structure of the deposits I used a wide range of analytical tools, such as electron microscopy, confocal laser scanning microscopy (CLSM), energy-dispersive x-ray analysis (EDX), pyrolysis gas chromatography combined with mass spectrometry (Py-GCMS), ion-exchange chromatography, gas chromatography and microbiological analyses. I also participated actively in the development of an innovative sensor based on the backscattering of light, designed to measure biofilm growth directly in the water circuits and tanks of a machine. My work showed that many of the chemicals used in papermaking react with one another, producing sticky organic layers on the steel surfaces of the machine circuits. I also found deposits that had been interpreted as microbes under light microscopy but that electron microscopy revealed to derive from alum: aluminium hydroxide that precipitated at pH 6.8 from the wire waters of a machine running on recycled fibre. Many papermakers still use alum as a fixing agent even though process conditions have shifted from acidic to neutral. Alum is regarded as the papermaker's "aspirin", but my research demonstrated its risks. I further found organic deposits originating from the saponification of substances such as resin (calcium soaps), which created a substrate that sustained the growth of stickies on many paper and board machines. I observed bacteria resembling Deinococcus geothermalis in cell morphology growing as colonies firmly attached to the surfaces of steel test coupons immersed in the water circuits of paper machines. These deinococcus-like colonies may serve as a platform, an attachment substrate, for masses of other microbes, which would explain why deposits commonly contain deinococci as a minor but never as the main structural component. When the quality of the waters used by paper machines (raw waters, warm water, biologically purified wastewater) is examined, the measurement method matters greatly: biofilm growth detected with the coupon immersion method and bacterial contamination measured by cultivation correlated poorly with each other, especially when filamentous bacteria were involved in the fouling. Environmental concerns have driven the closure of the water circuits of paper and board machines. Water recycling and the reuse of process waters raise the process temperature and increase the amounts of colloidal and dissolved substances circulating in the machine. I studied the concentrations in the circulation waters of three mills with different degrees of closure, discharging 0 m3, 0.5 m3 and 4 m3 of wastewater per tonne of product, based on the reuse of purified wastewater. At the zero-discharge mill, large amounts of organically bound carbon accumulated in the circulation waters (> 10 g L^-1), especially as volatile acids (lactic, acetic, propionic and butyric). Sulphates, chlorides, sodium and calcium also accumulated in large amounts, > 1 g L^-1 each. Based on 16S rRNA gene sequence analyses, the majority (> 40%) of all bacteria were related, although only distantly (< 96% sequence similarity), to a single species, Enterococcus cecorum. The mill discharging 4 m3 additionally harboured Bacillus thermoamylovorans and Bacillus coagulans. The deposits of the mills contained archaea in high numbers, ≥ 10^8 g^-1, but sequence similarity sufficient for identification was found to only one archaeal genus, Methanothrix. The results showed that closing the water circuits of a mill drastically reduced the diversity of the microbial community but did not prevent the mineralisation of dissolved and solid matter.

Relevance:

10.00%

Publisher:

Abstract:

This thesis consists of an introduction, four research articles and an appendix. The thesis studies relations between two different approaches to the continuum limit of models of two-dimensional statistical mechanics at criticality. The approach of conformal field theory (CFT) could be thought of as the algebraic classification of some basic objects in these models. It has been successfully used by physicists since the 1980s. The other approach, Schramm-Loewner evolutions (SLEs), is a recently introduced set of mathematical methods to study random curves or interfaces occurring in the continuum limit of the models. The first and second included articles argue, on the basis of statistical mechanics, what would be a plausible relation between SLEs and conformal field theory. The first article studies multiple SLEs: several random curves simultaneously in a domain. The proposed definition is compatible with a natural commutation requirement suggested by Dubédat. The curves of a multiple SLE may form different topological configurations, "pure geometries". We conjecture a relation between the topological configurations and the CFT concepts of conformal blocks and operator product expansions. Example applications of multiple SLEs include crossing probabilities for percolation and the Ising model. The second article studies SLE variants that represent models with boundary conditions implemented by primary fields. The most well known of these, SLE(kappa, rho), is shown to be simple in terms of the Coulomb gas formalism of CFT. In the third article the space of local martingales for variants of SLE is shown to carry a representation of the Virasoro algebra. Finding this structure is guided by the relation of SLEs and CFTs in general, but the result is established in a straightforward fashion. This article, too, emphasizes multiple SLEs and proposes a possible way of treating pure geometries in terms of the Coulomb gas. The fourth article states results of applications of the Virasoro structure to the open questions of SLE reversibility and duality. Proofs of the stated results are provided in the appendix. The objective is an indirect computation of certain polynomial expected values. Provided that these expected values exist, in generic cases they are shown to possess the desired properties, thus giving support for both reversibility and duality.
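
For orientation, the Virasoro algebra mentioned above is the Lie algebra spanned by generators $L_n$, $n \in \mathbb{Z}$, and a central element $c$; the commutation relations below are standard, as is the expression of the central charge in terms of the SLE parameter $\kappa$ in the SLE-CFT correspondence (neither is specific to this thesis):

$$[L_m, L_n] = (m - n)\, L_{m+n} + \frac{c}{12}\, m (m^2 - 1)\, \delta_{m+n,0}, \qquad c(\kappa) = 1 - \frac{3(\kappa - 4)^2}{2\kappa}.$$

Since $c(\kappa) = c(16/\kappa)$, the duality $\kappa \leftrightarrow 16/\kappa$ studied in the fourth article is natural from the CFT point of view.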

Relevance:

10.00%

Publisher:

Abstract:

Planar curves arise naturally as interfaces between two regions of the plane. An important part of statistical physics is the study of lattice models. This thesis is about the interfaces of 2D lattice models. The scaling limit is an infinite system limit which is taken by letting the lattice mesh decrease to zero. At criticality, the scaling limit of an interface is one of the SLE curves (Schramm-Loewner evolution), introduced by Oded Schramm. This family of random curves is parametrized by a real variable, which determines the universality class of the model. The first and the second paper of this thesis study properties of SLEs. They contain two different methods to study the whole SLE curve, which is, in fact, the most interesting object from the statistical physics point of view. These methods are applied to study two symmetries of SLE: reversibility and duality. The first paper uses an algebraic method and a representation of the Virasoro algebra to find common martingales to different processes, and in that way to confirm the symmetries for polynomial expected values of natural SLE data. In the second paper, a recursion is obtained for the same kind of expected values. The recursion is based on stationarity of the law of the whole SLE curve under an SLE-induced flow. The third paper deals with one of the most central questions of the field and provides a framework of estimates for describing 2D scaling limits by SLE curves. In particular, it is shown that a weak estimate on the probability of an annulus crossing implies that a random curve arising from a statistical physics model will have scaling limits and those will be well described by Loewner evolutions with random driving forces.
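
For reference, the standard definitions are as follows (this is textbook material, not a result of the thesis). The chordal Loewner equation in the upper half-plane, driven by a real-valued function $W_t$, is

$$\partial_t g_t(z) = \frac{2}{g_t(z) - W_t}, \qquad g_0(z) = z,$$

and SLE($\kappa$) is obtained by taking the driving force to be a scaled Brownian motion, $W_t = \sqrt{\kappa}\, B_t$; the parameter $\kappa \geq 0$ is the real variable mentioned above that selects the universality class.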

Relevance:

10.00%

Publisher:

Abstract:

The ever-expanding wireless access to the Internet in recent years has led to a proliferation of wireless and mobile devices connecting to the Internet. This has created the possibility for mobile devices equipped with multiple radio interfaces to connect to the Internet using any of several wireless access network technologies, such as GPRS, WLAN and WiMAX, in order to get the connectivity best suited for the application. These access networks are highly heterogeneous and vary widely in characteristics such as bandwidth, propagation delay and geographical coverage. The mechanism by which a mobile device switches between these access networks during an ongoing connection is referred to as vertical handoff, and it often results in an abrupt and significant change in the access link characteristics. The most common Internet applications, such as Web browsing and e-mail, use the Transmission Control Protocol (TCP) as their transport protocol, and the behaviour of TCP depends on end-to-end path characteristics such as bandwidth and round-trip time (RTT). As the wireless access link is most likely the bottleneck of a TCP end-to-end path, the abrupt changes in the link characteristics due to a vertical handoff may adversely affect TCP behaviour, degrading the performance of the application. The focus of this thesis is to study the effect of a vertical handoff on TCP behaviour and to propose algorithms that improve the handoff behaviour of TCP using cross-layer information about the changes in the access link characteristics. We begin this study by identifying the various problems of TCP due to a vertical handoff on the basis of extensive simulation experiments. We use this study as a basis to develop cross-layer assisted TCP algorithms in handoff scenarios involving GPRS and WLAN access networks. We then extend the scope of the study by developing cross-layer assisted TCP algorithms in a broader context applicable to a wide range of bandwidth and delay changes during a handoff. Finally, the algorithms developed here are shown to be easily extendable to the multiple-TCP-flow scenario. We evaluate the proposed algorithms by comparison with standard TCP (TCP SACK) and show that they are effective in improving TCP behaviour in vertical handoffs involving a wide range of access network bandwidths and delays. Our algorithms are easy to implement in real systems and involve modifications to the TCP sender algorithm only. The proposed algorithms are conservative in nature and do not adversely affect the performance of TCP in the absence of cross-layer information.
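
To illustrate the cross-layer idea (this is a generic sketch, not the specific algorithms developed in the thesis), a sender notified of a handoff can rescale its congestion state to the new link's bandwidth-delay product rather than keep probing with state tuned to the old link; all names and numbers below are hypothetical.

# Sketch of a cross-layer assisted TCP sender reacting to a vertical handoff
# notification. This is a generic illustration, not the thesis's algorithms.
MSS = 1460  # bytes per segment

class HandoffAwareSender:
    def __init__(self, cwnd=10 * MSS, ssthresh=64 * 1024):
        self.cwnd = cwnd          # congestion window, bytes
        self.ssthresh = ssthresh  # slow-start threshold, bytes
        self.srtt = None          # smoothed RTT estimate, seconds

    def on_vertical_handoff(self, bandwidth_bps, rtt_s):
        """Cross-layer hook: the link layer reports the new access link."""
        bdp = bandwidth_bps / 8 * rtt_s            # bandwidth-delay product
        self.ssthresh = max(2 * MSS, int(bdp))     # aim slow start at the new BDP
        self.cwnd = min(self.cwnd, self.ssthresh)  # never burst above it
        self.srtt = None  # old RTT samples no longer describe the path

# Handoff from WLAN (~11 Mbit/s, 20 ms RTT) down to GPRS (~40 kbit/s, 600 ms):
sender = HandoffAwareSender()
sender.on_vertical_handoff(bandwidth_bps=40_000, rtt_s=0.6)
print(sender.cwnd, sender.ssthresh)  # both now sized for the slow link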

Relevance:

10.00%

Publisher:

Abstract:

In recent years, XML has been widely adopted as a universal format for structured data. A variety of XML-based systems have emerged, most prominently SOAP for Web services, XMPP for instant messaging, and RSS and Atom for content syndication. This popularity is helped by the excellent support for XML processing in many programming languages and by the variety of XML-based technologies for the more complex needs of applications. Concurrently with this rise of XML, there has also been a qualitative expansion of the Internet's scope: mobile devices are becoming capable enough to be full-fledged members of various distributed systems. Such devices are battery-powered, their network connections are based on wireless technologies, and their processing capabilities are typically much lower than those of stationary computers. This dissertation presents work performed to reconcile these two developments. XML, as a highly redundant text-based format, is not obviously suitable for mobile devices that need to avoid extraneous processing and communication. Furthermore, the protocols and systems commonly used in XML messaging are often designed for fixed networks and may make assumptions that do not hold in wireless environments. This work identifies four areas of improvement in XML messaging systems: the programming interfaces to the system itself and to XML processing, the serialization format used for the messages, and the protocol used to transmit the messages. We show a complete system that improves the overall performance of XML messaging through consideration of these areas. The work is centered on actually implementing the proposals in a form usable on real mobile devices. The experimentation is performed on actual devices and real networks using the messaging system implemented as a part of this work. The experimentation is extensive and, through the use of several different devices, also provides a glimpse of what the performance of these systems may look like in the future.
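
The redundancy claim is easy to demonstrate. In the toy measurement below, the message content is invented and zlib merely stands in for a compact encoding; the point is how much of a text-XML message is redundant byte count that a leaner serialization format can avoid.

# Toy demonstration of XML's textual redundancy. zlib is only a stand-in for
# a compact serialization; the message below is invented for illustration.
import xml.etree.ElementTree as ET
import zlib

root = ET.Element("message")
for i in range(50):
    item = ET.SubElement(root, "item", id=str(i))
    item.text = "payload"

raw = ET.tostring(root)        # the bytes a text-based XML exchange would send
packed = zlib.compress(raw)    # most of those bytes compress away
print(len(raw), "bytes as text XML")
print(len(packed), "bytes after compression")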

Relevance:

10.00%

Publisher:

Abstract:

Information visualization is a process of constructing a visual presentation of abstract quantitative data. The characteristics of visual perception enable humans to recognize patterns, trends and anomalies inherent in the data with little effort in a visual display. Such properties of the data are likely to be missed in a purely text-based presentation. Visualizations are therefore widely used in contemporary business decision support systems. Visual user interfaces called dashboards are tools for reporting the status of a company and its business environment to facilitate business intelligence (BI) and performance management activities. In this study, we examine the research on the principles of human visual perception and information visualization as well as the application of visualization in a business decision support system. A review of current BI software products reveals that the visualizations included in them are often quite ineffective in communicating important information. Based on the principles of visual perception and information visualization, we summarize a set of design guidelines for creating effective visual reporting interfaces.

Relevance:

10.00%

Publisher:

Abstract:

With the proliferation of wireless and mobile devices equipped with multiple radio interfaces to connect to the Internet, vertical handoff involving different wireless access technologies enables users to get the best connectivity and service quality during the lifetime of a TCP connection. A vertical handoff may introduce an abrupt, significant change in the access link characteristics, and as a result the end-to-end path characteristics of a TCP connection, such as the bandwidth and the round-trip time (RTT), may change considerably. TCP may take several RTTs to adapt to these changes in path characteristics, and during this interval there may be packet losses and/or inefficient utilization of the available bandwidth. In this thesis we study the behaviour and performance of TCP in the presence of a vertical handoff. We identify the different handoff scenarios that adversely affect TCP performance. We propose several enhancements to the TCP sender algorithm, specific to the different handoff scenarios, to better adapt TCP to a vertical handoff. Our algorithms are conservative in nature and make use of cross-layer information obtained from the lower layers regarding the characteristics of the access links involved in a handoff. We evaluate the proposed algorithms by extensive simulation of the various handoff scenarios involving access links with a wide range of bandwidths and delays. We show that the proposed algorithms are effective in improving TCP behaviour in various handoff scenarios and do not adversely affect the performance of TCP in the absence of cross-layer information.

Relevance:

10.00%

Publisher:

Abstract:

Transposons, mobile genetic elements that are ubiquitous in all living organisms, have been used as tools in molecular biology for decades. They have the ability to move into discrete DNA locations with no apparent homology to the target site. The utility of transposons as molecular tools is based on their ability to integrate into various DNA sequences efficiently, producing extensive mutant clone libraries that can be used in various molecular biology applications. Bacteriophage Mu is one of the most useful transposons due to its well-characterized and simple in vitro transposition reaction. This study establishes the properties of the Mu in vitro transposition system as a versatile multipurpose tool in molecular biology. In addition, it describes Mu-based applications for engineering proteins by random insertional transposon mutagenesis in order to study structure-function relationships in proteins. We initially characterized the properties of the minimal Mu in vitro transposition system and showed that it works efficiently and accurately, producing insertions into a wide spectrum of target sites in different DNA molecules. We then developed a pentapeptide insertion mutagenesis strategy for inserting random five-amino-acid cassettes into proteins. These protein variants can be used especially for screening sites important for protein-protein interactions; the system may also produce temperature-sensitive variants of the protein of interest. Furthermore, we developed an efficient screening system for high-resolution mapping of protein-protein interfaces with pentapeptide insertion mutagenesis. This was accomplished by combining the mutagenesis with subsequent yeast two-hybrid screening and PCR-based genetic footprinting. This combination allows the analysis of the whole mutant library en masse, without the need to produce or isolate separate mutant clones, and protein-protein interfaces can be determined at amino acid accuracy. The system was validated by analysing the region of JFC1 that interacts with Rab8A, and we show that the interaction is mediated via the JFC1 Slp homology domain. In addition, we developed a procedure for producing nested sets of N- and C-terminal deletion variants of proteins with the Mu system. These variants are useful in many functional studies of proteins, especially in mapping regions involved in protein-protein interactions. This methodology was validated by analysing the region in yeast Mso1 involved in the interaction with Sec1. The results of this study show that the Mu in vitro transposition system is versatile for various application purposes and can be efficiently adapted to random protein engineering applications for functional studies of proteins.
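
The footprinting logic lends itself to a toy simulation: in-frame insertions add a five-residue cassette at random positions, and screening the pooled library reveals which positions tolerate the insertion. The sequence, cassette and "interface" below are invented; the sketch illustrates only the mapping principle, not the wet-lab protocol.

# Toy simulation of pentapeptide insertion mutagenesis and footprint mapping.
# Sequence, cassette and the binding interface are invented for illustration.
import random

PROTEIN = "MSTNPKPQRKTKRNTNRRPQDVKFPGG"  # hypothetical target protein
CASSETTE = "CGRID"                       # hypothetical five-residue insertion

def mutant_library(seq, n):
    """Insert the cassette at n random residue positions (frame preserved)."""
    return [seq[:p] + CASSETTE + seq[p:]
            for p in (random.randrange(1, len(seq)) for _ in range(n))]

def footprint(library, interacts):
    """Positions where the insertion still permits the interaction."""
    return sorted({m.index(CASSETTE) for m in library if interacts(m)})

def interacts(mutant):
    """Invented rule: binding survives unless the cassette hits residues 10-21."""
    return not (9 <= mutant.index(CASSETTE) < 21)

# The gap in the tolerated positions maps the interaction site.
print("insertions tolerated at:", footprint(mutant_library(PROTEIN, 300), interacts))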