929 results for Personal uses of computer


Relevance: 100.00%

Abstract:

In this paper, the theory of hidden Markov models (HMM) is applied to the problem of blind (without training sequences) channel estimation and data detection. Within an HMM framework, the Baum-Welch (BW) identification algorithm is frequently used to obtain maximum-likelihood (ML) estimates of the corresponding model. However, such a procedure assumes the model (i.e., the channel response) to be static throughout the observation sequence. By introducing a parametric model for time-varying channel responses, a version of the algorithm that is more appropriate for mobile channels [time-dependent Baum-Welch (TDBW)] is derived. To compare algorithm behavior, a set of computer simulations for a GSM scenario is provided. Results indicate that, in comparison to other BW versions of the algorithm, the TDBW approach attains a remarkable performance enhancement, at the cost of only a moderate increase in computational complexity.
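
The TDBW derivation itself is not given in the abstract; as a point of reference, the sketch below shows one re-estimation pass of the standard (static-channel) Baum-Welch algorithm for a discrete HMM, which TDBW extends with a parametric time-varying model. All names and dimensions are illustrative.

```python
# Minimal sketch of one Baum-Welch re-estimation pass for a discrete HMM.
# The paper's TDBW variant (time-varying channel model) is not reproduced.
import numpy as np

def forward_backward(obs, A, B, pi):
    """Scaled forward-backward recursions for one observation sequence."""
    T, N = len(obs), A.shape[0]
    alpha = np.zeros((T, N))
    c = np.zeros(T)                      # per-step scaling factors
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    return alpha, beta, c

def baum_welch_step(obs, A, B, pi):
    """One ML re-estimation of (A, B, pi); assumes a static model."""
    T, N = len(obs), A.shape[0]
    alpha, beta, c = forward_backward(obs, A, B, pi)
    gamma = alpha * beta                 # state posteriors, up to scaling
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = np.zeros((N, N))                # expected transition counts
    for t in range(T - 1):
        x = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])[None, :]
        xi += x / x.sum()
    A_new = xi / gamma[:-1].sum(axis=0)[:, None]
    B_new = np.zeros_like(B)
    for k in range(B.shape[1]):
        B_new[:, k] = gamma[np.array(obs) == k].sum(axis=0)
    B_new /= gamma.sum(axis=0)[:, None]
    return A_new, B_new, gamma[0]
```

Iterating `baum_welch_step` until the log-likelihood (the sum of `log c`) stops improving yields the static-model ML estimate the abstract refers to.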

Relevance: 100.00%

Abstract:

BACKGROUND AND STUDY AIMS: To summarize the published literature on assessment of the appropriateness of colonoscopy for colorectal cancer (CRC) screening in asymptomatic individuals without a personal history of CRC or polyps, and to report the appropriateness criteria developed by an expert panel, the 2008 European Panel on the Appropriateness of Gastrointestinal Endoscopy (EPAGE II). METHODS: A systematic search of guidelines, systematic reviews, and primary studies regarding colonoscopy for CRC screening was performed. The RAND/UCLA Appropriateness Method was applied to develop appropriateness criteria for colonoscopy in these circumstances. RESULTS: Available evidence for CRC screening comes from small case-control studies with heterogeneous results, and from indirect evidence from randomized controlled trials (RCTs) on fecal occult blood test (FOBT) screening and from studies on flexible sigmoidoscopy screening. Most guidelines recommend screening colonoscopy every 10 years starting at age 50 in average-risk individuals. In individuals at higher risk of CRC due to family history, there is a consensus that it is appropriate to offer screening colonoscopy before age 50. EPAGE II considered screening colonoscopy appropriate above age 50 in average-risk individuals. Panelists deemed screening colonoscopy appropriate, with shorter surveillance intervals, for younger patients whose family or personal risk of CRC is higher. A positive FOBT and the discovery of adenomas at sigmoidoscopy are also considered appropriate indications. CONCLUSIONS: Despite the lack of RCT evidence, colonoscopy is recommended by most published guidelines and by the EPAGE II criteria, available online (http://www.epage.ch), as a screening option for CRC in individuals at average risk, and undisputedly as the main screening tool for CRC in individuals at moderate and high risk.

Relevance: 100.00%

Abstract:

Intensity Modulated RadioTherapy (IMRT) is a treatment technique that uses beams with modulated fluence. IMRT is now widespread in industrialized countries, owing to the better dose homogeneity it achieves inside the target volume and to its ability to lower doses to organs at risk in complex clinical cases. One common way to carry out beam modulation is to sum smaller beams (segments) with the same incidence; this technique is called step-and-shoot IMRT. In a clinical context, it is necessary to verify treatment plans before the first irradiation, and for this technique plan verification is still not solved satisfactorily. An independent monitor unit calculation (representative of the weight of each segment) cannot be performed for step-and-shoot IMRT, because the segment weights are not known a priori but are computed during inverse planning. Besides, treatment plan verification by comparison with measured data is time consuming and is performed in a simplified geometry, usually a cubic water phantom with all machine angles set to zero. In this work, an independent method for monitor unit calculation for step-and-shoot IMRT is described. The method is based on the Monte Carlo code EGSnrc/BEAMnrc; the Monte Carlo model of the linear accelerator head was validated by comparing simulated and measured dose distributions over a large range of situations. The segments of an IMRT treatment plan are simulated individually by Monte Carlo in the exact geometry of the treatment, and the resulting dose distributions are converted into absorbed dose to water per monitor unit. The total treatment dose in each volume element of the patient (voxel) can then be expressed as a linear matrix equation of the monitor units and of the dose per monitor unit of every segment. This equation is solved with a Non-Negative Least Squares (NNLS) fit algorithm. Because computer limitations prevent using every voxel inside the patient volume, several voxel-selection schemes were tested; the best choice consists in using the voxels contained in the Planning Target Volume (PTV). The method was tested on eight clinical cases representative of usual radiotherapy treatments. The monitor units obtained lead to global dose distributions that are clinically equivalent to those of the treatment planning system. Thus, this independent monitor unit calculation method for step-and-shoot IMRT is validated and can be used in clinical routine. A similar method could be considered for other treatment modalities, such as tomotherapy or volumetric modulated arc therapy.
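
As an illustration of the matrix step described above, the sketch below recovers monitor-unit weights from per-segment dose-per-MU distributions with SciPy's non-negative least-squares solver; the matrix shapes and values are invented placeholders, not the thesis data.

```python
# Minimal sketch of the MU recovery step: the dose in each PTV voxel is a
# linear combination of the per-segment dose-per-MU values, so the MUs
# solve a non-negative least-squares fit. Shapes and values are invented.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_voxels, n_segments = 500, 12          # voxels restricted to the PTV

# D[i, j] = Monte Carlo dose per MU delivered to voxel i by segment j
D = rng.random((n_voxels, n_segments))

mu_true = rng.uniform(5.0, 50.0, n_segments)   # "unknown" MU weights
d_total = D @ mu_true                          # planned total dose in PTV

mu_est, residual = nnls(D, d_total)            # non-negativity enforced
print(np.allclose(mu_est, mu_true, rtol=1e-6), residual)
```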

Relevance: 100.00%

Abstract:

Over 70% of the total costs of an end product are consequences of decisions made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems; an appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and computer program which reduce the modelling and optimisation time for structural design. The program needed an optimisation algorithm that is suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and the optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed; they combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and of steel structures for flat and ridge roofs. This thesis demonstrates that the modelling time, the most time-consuming phase, is significantly reduced, that modelling errors are reduced, and that the results are more reliable. A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, is tested with optimisation cases of steel structures that include hundreds of constraints. The tested algorithm can be used nearly as a black box, without parameter settings or penalty factors for the constraints.
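
The thesis's selection rule is not spelled out in this abstract; the sketch below shows a penalty-free, feasibility-first comparison rule of the kind described (in the spirit of Deb's feasibility rules) inside a simple evolution loop with parent-child replacement. The objective, constraint, and bounds are illustrative placeholders.

```python
# Sketch of a penalty-free, feasibility-based selection rule: no constraint
# weight factors are needed because feasibility is compared directly.
import random

def violation(x, constraints):
    """Total constraint violation; 0.0 means feasible. Requires g(x) <= 0."""
    return sum(max(0.0, g(x)) for g in constraints)

def better(a, b, f, constraints):
    """Penalty-free comparison: feasibility first, then objective value."""
    va, vb = violation(a, constraints), violation(b, constraints)
    if va == 0.0 and vb == 0.0:
        return a if f(a) < f(b) else b      # both feasible: lower objective
    if va == 0.0 or vb == 0.0:
        return a if va == 0.0 else b        # feasible beats infeasible
    return a if va < vb else b              # both infeasible: less violation

def evolve(f, constraints, bounds, pop_size=30, generations=200, sigma=0.1):
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        children = [[min(max(x + random.gauss(0.0, sigma * (hi - lo)), lo), hi)
                     for x, (lo, hi) in zip(parent, bounds)]
                    for parent in pop]
        pop = [better(p, c, f, constraints) for p, c in zip(pop, children)]
    best = pop[0]
    for x in pop[1:]:
        best = better(best, x, f, constraints)
    return best

# Toy use: minimise cross-section area subject to a stress-like constraint.
area = lambda x: x[0] * x[1]
constraints = [lambda x: 1000.0 / (x[0] * x[1]) - 150.0]   # stress <= 150
print(evolve(area, constraints, bounds=[(1.0, 50.0), (1.0, 50.0)]))
```

Because infeasible candidates are ranked only by their total violation, no weighting of individual constraints against the objective is ever required, matching the "black box" behaviour claimed above.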

Relevance: 100.00%

Abstract:

The diffusion of mobile telephony began in 1971 in Finland, when the first car phones, called ARP, were taken into use. The technology evolved from ARP to NMT and later to GSM; the main application, however, was voice transfer. The birth of the Internet created an open public data network and easy access to other types of computer-based services over networks. Telephones had been used as modems, but the development of cellular technologies enabled automatic access from mobile phones to the Internet, and other wireless technologies, for instance wireless LANs, were also introduced. Telephony had developed from analog to digital in fixed networks, which allowed easy integration of fixed and mobile networks. This development opened completely new functionality to computers and mobile phones, and it initiated the merger of the information technology (IT) and telecommunication (TC) industries. Despite the opportunity this created for new competition between firms, applications based on the new functionality were rare. Furthermore, technology development combined with innovation can be disruptive to industries. This research focuses on the new technology's impact on competition in the ICT industry through understanding the strategic needs and alternative futures of the industry's customers. The speed of change in the ICT industry is high, and it was therefore valuable to integrate the dynamic capability view of the firm into this research. Dynamic capabilities are an application of the resource-based view (RBV) of the firm, and, as stated in the literature, strategic positioning complements the RBV. This theoretical framework leads the research to focus on three areas: customer strategic innovation and business model development, external future analysis, and process development combining these two. The theoretical contribution of the research is the development of a methodology integrating the theories of the RBV, dynamic capabilities and strategic positioning. The research approach has been constructive, owing to the actual managerial problems that initiated the study; the requirement for iterative and innovative progress supported this choice of approach. The study applies known methods of product development, for instance the innovation process in the Group Decision Support Systems (GDSS) laboratory and Quality Function Deployment (QFD), and combines them with known strategy analysis tools such as industry analysis and the scenario method. As the main result, the thesis presents the strategic innovation process, in which new business concepts describe alternative resource configurations and scenarios describe alternative competitive environments; this can be a new way for firms to achieve competitive advantage in high-velocity markets. In addition to the strategic innovation process, the study also produced approximately 250 new innovations for the participating firms, reduced technology uncertainty, supported strategic infrastructural decisions in the firms, and yielded a knowledge bank including data from 43 ICT and 19 paper industry firms between the years 1999 and 2004. The methods presented in this research are also applicable to other industries.

Relevance: 100.00%

Abstract:

1. Introduction "The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49.1 These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases,2 the EDD, into Finnish copyright legislation in 1998. Now, in the year 2005, after more than half a decade of domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection remain uncertain both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular, and currently peculiarly European, new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD, and second, to realise the potential and risks inherent in the new legislation in its economic, cultural and societal dimensions.

2. Subject-matter of the study: basic issues The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches, either qualitatively or quantitatively, the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form, or even free where public lending libraries provide access to the information online. This also renders it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed. The fear of illegal copying can lead to stark technical protection that in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, and thus weaken the possibility of access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy.

3. Particular issues in the digital economy and information networks All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the mobile Internet, peer-to-peer networks, and local and wide area networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables,3 previously protected partially by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, and importantly, numerous databases are collected in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded in them constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example is a database consisting of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: Has the collection and securing of the validity of this information required an essential investment? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, data and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only once the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and of making available to the public seem to fit digital materials ill, or lead to interpretations that are at variance with the analogue domain as regards lawful and unlawful uses of information. This may well interfere with, or rework, the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services.

4. The international sphere After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, the disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically. Consequently, guidelines on the correct interpretation of the Directive importing practical, business-oriented solutions may well have application at the European level. This underlines the exigency for a thorough analysis of the implications, meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, with a direct negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least yet, have a sui generis database regime or its kin, while both the political and academic discourse on the matter abounds.

5. The objectives of the study The background outlined above, with its several open issues, calls for a detailed study of the following questions: What is a database-at-law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation? How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context? What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications? And, finally, the difficult question of the relation between database protection and the protection of factual information as such.

6. Disposition The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political background and rationale of European database protection and its subsequent legislative evolution, reflected against the international backdrop; an introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and sui generis facets in detail, together with the emergent application of the machinery in real-life societal and particularly commercial contexts. Furthermore, a general outline of copyright, relevant in the context of copyright-protected databases, is provided. For purposes of further comparison, a chapter on the Nordic catalogue rule, the precursor of the sui generis database right, also ensues. The third and final part analyses the positive and negative impacts of the database protection system and attempts to scrutinize the implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue of IPR protection of information per se, a new tenet in the domain of copyright and related rights.

Relevance: 100.00%

Abstract:

A Fundamentals of Computing Theory course involves different topics that are core to the Computer Science curricula and whose level of abstraction makes them difficult both to teach and to learn. Such difficulty stems from the complexity of the abstract notions involved and the required mathematical background. Surveys conducted among our students showed that many of them were applying some theoretical concepts mechanically rather than developing significant learning. This paper shows a number of didactic strategies that we introduced in the Fundamentals of Computing Theory curricula to cope with the above problem. The proposed strategies were based on a stronger use of technology and a constructivist approach. The final goal was to promote more significant learning of the course topics.

Relevance: 100.00%

Abstract:

Native plants and animals are a natural heritage threatened by one of the six greatest extinction events in Earth's history. Humans, through habitat transformation, exploitation, and species introductions, are driving this extinction event. To turn this tide, Speziale et al. (2014) suggest reducing human dependence on non-native species by increasing the use, harvest, planting, and raising of native species, thereby increasing their cultural and economic value. The search for new or under-appreciated uses of native species is laudable, especially if it helps protect them and contributes to local cultural diversity. Such efforts are arguably an inherent trait of human curiosity and entrepreneurship and are a central platform of popular movements such as slow food and native gardening. However, Speziale et al.'s hypothesis - that using native species can protect them - is less simple than they suggest. We refute the idea of nativism that underpins Speziale et al.'s proposal and makes it poorly defensible, and we consider the unaddressed consequences of the proposal for people and for conservation.

Relevance: 100.00%

Abstract:

In this commentary, we argue that the term 'prediction' is overused when, in fact, referring to the foundational writings of de Finetti, the corresponding term should be inference. In particular, we intend (i) to summarize and clarify relevant subject matter on prediction from established statistical theory, and (ii) to point out the logic of this understanding with respect to practical uses of the term prediction. Written from an interdisciplinary perspective, associating statistics and forensic science as an example, this discussion also connects to related fields such as medical diagnosis and other areas of application where reasoning based on scientific results is practiced in societally relevant contexts. This includes forensic psychology, which uses prediction as part of its vocabulary when dealing with matters that arise in the course of legal proceedings.
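
As a textbook illustration of the point (not taken from the commentary itself), in de Finetti's account "prediction" is inference about a future observable, expressed through the posterior predictive distribution; for exchangeable binary data with a uniform prior this reduces to Laplace's rule of succession:

```latex
% Illustration only: the predictive probability of the next observation is
% an inference computed from the posterior over the unknown parameter.
\[
P(X_{n+1} = 1 \mid x_1, \dots, x_n)
  = \int_0^1 \theta \, \pi(\theta \mid x_1, \dots, x_n) \, \mathrm{d}\theta .
\]
% With a uniform prior on \theta and s successes observed in n trials,
% this evaluates to Laplace's rule of succession:
\[
P(X_{n+1} = 1 \mid s \text{ successes in } n \text{ trials}) = \frac{s+1}{n+2}.
\]
```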

Relevance: 100.00%

Abstract:

This paper presents the qualitative data collection process aimed at studying the impact social relations and networks have on the educational paths of immigrant students. In the framework of an R&D longitudinal study funded by the Ministry of Science and Innovation (2012-2014), the research team tracked the paths of 87 immigrant students, of whom only 17 successfully achieved the transition through the first and second year of Post-16 Education. A vast range of literature notes that relationships are an important part of the migration process and of social integration analysis, as well as of school history in terms of success or failure. Through the fieldwork, researchers collected the personal networks of all immigrant students from three high schools who were at that time attending the last year of compulsory school. The network structure influences their social capital and therefore determines the resources, goods and types of support individuals can access; all these aspects are influential elements in the configuration and development of the academic trajectories of immigrant students. At the end of the second year of Post-16 Education (two years later), the study captured the personal networks of these students again and, through qualitative interviews, analysed and discussed their evolution and influence on the students' paths. Such interviews facilitated the discussion of their relationships while providing interesting narratives that are presented in the text. To do so, the biographical interpretive narrative method of interviewing was implemented.

Relevance: 100.00%

Abstract:

The aim of this thesis was to determine criteria for selecting a new market for an industrial product. The thesis focused on established approaches to international market selection and sought to apply one method in practice in the empirical part by means of a case study. The research approach was exploratory and based on secondary analysis. The data sources used were largely secondary, yielding qualitative data, although interviews were also conducted. A comprehensive literature review of known theoretical approaches to international market selection formed part of the thesis, and the three most important approaches were presented in more detail. One of these, the non-systematic approach, formed the framework for the empirical part. The empirical part sought to apply one of the non-systematic models in the international paper industry. The purpose was to identify the most attractive countries for possible marketing efforts in one end-use area of the product. Climatic conditions, poultry headcount, and poultry growth rate were used as filters to reduce the number of candidate countries. The empirical part clearly suffered from a lack of relevant data, so the reliability and validity of the thesis can be questioned to some extent.
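
The screening step described above amounts to successive filters over candidate countries; the sketch below illustrates it with invented placeholder data and cut-offs (the thesis's actual criteria values are not given in the abstract).

```python
# Minimal sketch of the country-screening step: candidates are filtered on
# climate, poultry headcount, and poultry growth rate. All figures and
# thresholds below are invented placeholders.
candidates = [
    # (country, suitable_climate, poultry_head_millions, growth_pct)
    ("Country A", True, 120.0, 4.1),
    ("Country B", True, 35.0, 1.2),
    ("Country C", False, 200.0, 6.0),
    ("Country D", True, 80.0, 3.5),
]

MIN_HEADCOUNT = 50.0   # millions; illustrative cut-off
MIN_GROWTH = 2.0       # percent per year; illustrative cut-off

shortlist = [
    country
    for country, climate_ok, head, growth in candidates
    if climate_ok and head >= MIN_HEADCOUNT and growth >= MIN_GROWTH
]
print(shortlist)   # -> ['Country A', 'Country D']
```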

Relevance: 100.00%

Abstract:

Nowadays, three-tier client-server applications are of great interest to both their developers and their users. Thanks to the rapid development of information technology, these applications are used widely in various areas of industry. At present, there are many tools for developing client-server applications that also satisfy the requirements set by customers; these tools do not, however, allow flexible work with the graphical user interface. This master's thesis deals with the development of client-server applications using XML. This approach makes it possible to build client-server applications so that their graphical user interface and appearance can be easily modified without recompiling the application core. The thesis consists of two parts, theoretical and practical. The theoretical part gives general information about the client-server architecture and describes the main points of software engineering. The practical part presents the results, a development approach for client-server applications using XML, and the use case and sequence diagrams leading to those results. The practical part also contains examples of the implemented XML structures, which describe the appearance of the client application's screen forms and the server query schemas.
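
The thesis's XML structures are not reproduced in the abstract; the sketch below illustrates the general idea with an invented schema: a client screen form is declared in XML and interpreted at run time, so the user interface can change without recompiling the application core.

```python
# Minimal sketch of an XML-declared client screen form rendered at run
# time. The schema (tags and attributes) is an invented illustration,
# not the one implemented in the thesis.
import xml.etree.ElementTree as ET

FORM_XML = """
<form title="Customer search">
  <field name="name"    label="Name"    type="text"/>
  <field name="country" label="Country" type="text"/>
  <button action="submitQuery" label="Search"/>
</form>
"""

def build_form(xml_text):
    """Turn the XML form declaration into a list of widget descriptions."""
    root = ET.fromstring(xml_text)
    print(f"== {root.get('title')} ==")
    widgets = []
    for node in root:
        if node.tag == "field":
            widgets.append(("input", node.get("name"), node.get("label")))
        elif node.tag == "button":
            widgets.append(("button", node.get("action"), node.get("label")))
    return widgets

for kind, name, label in build_form(FORM_XML):
    print(kind, name, label)
```

Editing FORM_XML changes the rendered form immediately, which is the decoupling of appearance from the compiled core that the abstract describes.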

Relevance: 100.00%

Abstract:

This master's thesis deals with controlling access to personal information and with describing that information. In the practical part of the work, an XML model for describing personal information was designed. Using personal information makes it possible to offer personalized service and also to automate the service for the user. Describing personal information is essential so that services can query and understand it. Personal information is also affected by various factors, which must be taken into account when describing it. The spread of personal information to different service providers brings risks as well: personal information falling into the wrong hands may cause serious problems for its owner. For the safe and reliable use of personal information, it is therefore essential that the user can control to whom each piece of information is disclosed.
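
The designed XML model is not reproduced in the abstract; the sketch below illustrates the underlying idea with an invented schema in which each data item carries an owner-defined release rule, so a service provider receives only the fields disclosed to it.

```python
# Minimal sketch: personal data described in XML, each item carrying an
# owner-defined access rule. The schema and names are invented
# illustrations, not the thesis's actual model.
import xml.etree.ElementTree as ET

PROFILE_XML = """
<profile owner="user42">
  <item name="email"   share-with="shop,bank">user42@example.invalid</item>
  <item name="address" share-with="shop">Main Street 1</item>
  <item name="income"  share-with="bank">50000</item>
</profile>
"""

def visible_to(xml_text, requester):
    """Return only the items the owner has released to this requester."""
    root = ET.fromstring(xml_text)
    released = {}
    for item in root.iter("item"):
        allowed = item.get("share-with", "").split(",")
        if requester in allowed:
            released[item.get("name")] = item.text
    return released

print(visible_to(PROFILE_XML, "shop"))   # email and address only
print(visible_to(PROFILE_XML, "bank"))   # email and income only
```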

Relevance: 100.00%

Abstract:

By means of computer simulations and solution of the equations of the mode coupling theory (MCT), we investigate the role of intramolecular barriers in several dynamic aspects of nonentangled polymers. The investigated dynamic range extends from the caging regime characteristic of glass-formers to the relaxation of the chain Rouse modes. We review our recent work on this question, provide new results, and critically discuss the limitations of the theory. Solutions of the MCT for the structural relaxation reproduce qualitative trends of the simulations for weak and moderate barriers. However, a progressive discrepancy is revealed as the limit of stiff chains is approached. This disagreement does not seem related to dynamic heterogeneities, which indeed are not enhanced by increasing the barrier strength. Nor is it connected with the breakdown of the convolution approximation for three-point static correlations, which retains its validity for stiff chains. These findings suggest the need for an improvement of the MCT equations for polymer melts. Concerning the relaxation of the chain degrees of freedom, MCT provides a microscopic basis for time scales from chain reorientation down to the caging regime. It rationalizes, from first principles, the observed deviations from the Rouse model on increasing the barrier strength. These include anomalous scaling of the relaxation times, long-time plateaux, and a nonmonotonic wavelength dependence of the mode correlators.
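
For reference, the deviations mentioned above are measured against the standard Rouse-model expressions for the mode correlators and relaxation times (textbook results, not taken from the paper):

```latex
% Standard Rouse-model reference expressions: the correlator of Rouse mode
% p for a chain of N beads decays exponentially, with relaxation times
% scaling as N^2/p^2.
\[
C_p(t) = \frac{\langle \mathbf{X}_p(t) \cdot \mathbf{X}_p(0) \rangle}
              {\langle \mathbf{X}_p(0)^2 \rangle}
       = e^{-t/\tau_p},
\qquad
\tau_p \propto \frac{\zeta N^2 b^2}{k_{\mathrm{B}} T \, p^2}
\quad (p \ll N).
\]
% Anomalous scaling of \tau_p with p, long-time plateaux, and stretched,
% nonexponential C_p(t) therefore signal the departures from Rouse
% behaviour discussed above as the barrier strength increases.
```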