925 results for database solution
Abstract:
Maintaining and developing software with a long development history is difficult, because its documentation is often incomplete or outdated. This master's thesis seeks a solution for describing such software and the system behind it. The goals are to support the maintenance of the current software and the induction of new personnel. A further goal is to lay the groundwork for the design of a new, replacing application by documenting the application-domain expertise embedded in the current system. The thesis develops a description method for describing the system hierarchically, from a hardware-level overview down to the class structure and functionality of the software. The hardware and class-structure descriptions are structural descriptions whose purpose is to explain the composition of the system and its parts. The descriptions of functionality are implemented as use-case descriptions. The work focused in particular on describing the target system's central software and database. The most important parts of the software, those containing the most application-domain know-how, were selected, and example descriptions of them were created. Using the developed method, the descriptions are easy to extend as needed, not only to other parts of the software but also to describing the hardware and the system as a whole in more depth.
Abstract:
The information systems infrastructure of many present-day companies has evolved into a heterogeneous environment, in which systems supplied by several different vendors run on different operating system and hardware platforms. To manage a heterogeneous environment, a company needs a centralized data repository that stores information about the system environment in use and its components. For this purpose, Microsoft brought the Active Directory 2000 directory service to market in 1999. In a heterogeneous environment, authenticating and authorizing users is very demanding. In the worst case, a user may have dozens of username-password combinations for the company's different information systems. In addition, user-specific access rights have to be maintained in every information system. From the point of view of the user as well as the administrator, such a scenario is a nightmare. This master's thesis surveys the possibilities for centralizing the authentication and authorization of Oracle database users in the Active Directory directory service. The thesis examines ready-made commercial solutions suitable for the purpose and investigates the possibilities of implementing a custom solution using the programming interfaces available in the environment.
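As a purely illustrative sketch of what such an interface-level integration can look like, the snippet below authenticates a user against Active Directory over LDAP and reads group memberships that an application layer could map to Oracle roles. The use of the Python ldap3 package, the host name, domain and attribute names are assumptions for illustration, not details taken from the thesis.

```python
# Illustrative only: verify a user against Active Directory over LDAP and
# read group memberships that an application could map to Oracle roles.
# Host, base DN and account names are hypothetical placeholders.
from ldap3 import Server, Connection, ALL, NTLM

def authenticate_and_authorize(username: str, password: str) -> list[str]:
    server = Server("ad.example.com", get_info=ALL)
    # Bind with the user's own credentials; a failed bind means authentication failed.
    conn = Connection(server, user=f"EXAMPLE\\{username}", password=password,
                      authentication=NTLM)
    if not conn.bind():
        raise PermissionError("Active Directory rejected the credentials")
    # Authorization: fetch the groups the user belongs to.
    conn.search("dc=example,dc=com",
                f"(sAMAccountName={username})",
                attributes=["memberOf"])
    groups = conn.entries[0].memberOf.values if conn.entries else []
    conn.unbind()
    return list(groups)

# A database layer could then translate the returned group DNs into Oracle roles.
```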
Abstract:
New means of separating gold from ore have recently been sought around the world for economic and environmental reasons. The cyanidation process has dominated gold recovery for over a hundred years. In this process, gold is leached into a dilute cyanide solution, from which it is recovered with activated carbon. However, there are efforts to reduce the use of cyanide because of its toxicity. In addition, an increasing proportion of the ore mined today is difficult to process cost-effectively with cyanide. Gold recovery from cyanide and chloride solutions was reviewed on the basis of the literature. The chemistry of gold during leaching was examined before addressing gold recovery with activated carbon. The life cycle of activated carbon as a gold adsorbent was covered from manufacture to disposal, including fouling of the carbon in the process and its regeneration. The behaviour of activated carbon in cyanide and chloride solutions was examined separately. Gold recovery from copper-bearing ores was also discussed. Gold recovery from chloride solution using activated carbon was studied experimentally. The main research topics were the kinetics of adsorption, the effect of copper on adsorption, the effect of the activated carbon on adsorption, and the selective stripping of the adsorbed metals from the carbon. The effect of oxidative stripping on the desorption of gold from the carbon was studied in detail. Methods for separating gold from copper ore using activated carbon were outlined on the basis of the results of the thesis. According to the key results of the thesis, gold does not necessarily precipitate onto the surface of activated carbon from chloride solution. The observation was confirmed from scanning electron microscope images of loaded carbon particles and from microanalyses of the particles. The reduction of gold to metallic gold in the activated carbon can presumably be avoided by using strongly oxidizing conditions. The activated carbon apparently oxidizes under these conditions, which enables gold chloride to adsorb onto the carbon.
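For orientation only: adsorption-kinetics data of this kind are often fitted in the activated-carbon literature with simple empirical rate laws. The pseudo-first-order form below is one such commonly used expression, shown as an illustrative example and not taken from the thesis itself.

```latex
% Pseudo-first-order loading of gold onto activated carbon (illustrative model):
% q(t) = loading at time t, q_e = equilibrium loading, k = rate constant.
\frac{\mathrm{d}q}{\mathrm{d}t} = k\,(q_e - q)
\qquad\Longrightarrow\qquad
q(t) = q_e\left(1 - e^{-kt}\right)
```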
Abstract:
OBJECTIVES: Resuscitation in severe head injury may be detrimental when given with hypotonic fluids. We evaluated the effects of lactated Ringer's solution (sodium 131 mmol/L, 277 mOsm/L) compared with hypertonic saline (sodium 268 mmol/L, 598 mOsm/L) in severely head-injured children over the first 3 days after injury. DESIGN: An open, randomized, and prospective study. SETTING: A 16-bed pediatric intensive care unit (ICU) (level III) at a university children's hospital. PATIENTS: A total of 35 consecutive children with head injury. INTERVENTIONS: Thirty-two children with Glasgow Coma Scores of <8 were randomly assigned to receive either lactated Ringer's solution (group 1) or hypertonic saline (group 2). Routine care was standardized, and included the following: head positioning at 30 degrees; normothermia (96.8 degrees to 98.6 degrees F [36 degrees to 37 degrees C]); analgesia and sedation with morphine (10 to 30 microg/kg/hr), midazolam (0.2 to 0.3 mg/kg/hr), and phenobarbital; volume-controlled ventilation (PaCO2 of 26.3 to 30 torr [3.5 to 4 kPa]); and optimal oxygenation (PaO2 of 90 to 105 torr [12 to 14 kPa], oxygen saturation of >92%, and hematocrit of >0.30). MEASUREMENTS AND MAIN RESULTS: Mean arterial pressure and intracranial pressure (ICP) were monitored continuously and documented hourly and at every intervention. The means of every 4-hr period were calculated and serum sodium concentrations were measured at the same time. An ICP of >15 mm Hg was treated with a predefined sequence of interventions, and complications were documented. There was no difference with respect to age, male/female ratio, or initial Glasgow Coma Score. In both groups, there was an inverse correlation between serum sodium concentration and ICP (group 1: r = -.13, r2 = .02, p < .03; group 2: r = -.29, r2 = .08, p < .001) that disappeared in group 1 and increased in group 2 (group 1: r = -.08, r2 = .01, NS; group 2: r = -.35, r2 = .12, p < .001). Correlation between serum sodium concentration and cerebral perfusion pressure (CPP) became significant in group 2 after 8 hrs of treatment (r = .2, r2 = .04, p = .002). Over time, ICP and CPP did not significantly differ between the groups. However, to keep ICP at <15 mm Hg, group 2 patients required significantly fewer interventions (p < .02). Group 1 patients received less sodium (8.0 +/- 4.5 vs. 11.5 +/- 5.0 mmol/kg/day, p = .05) and more fluid on day 1 (2850 +/- 1480 vs. 2180 +/- 770 mL/m2, p = .05). They also had a higher frequency of acute respiratory distress syndrome (4 vs. 0 patients, p = .1) and of more than two complications (6 vs. 1 patient, p = .09). Group 2 patients had significantly shorter ICU stay times (11.6 +/- 6.1 vs. 8.0 +/- 2.4 days; p = .04) and shorter mechanical ventilation times (9.5 +/- 6.0 vs. 6.9 +/- 2.2 days; p = .1). The survival rate and duration of hospital stay were similar in both groups. CONCLUSIONS: Treatment of severe head injury with hypertonic saline is superior to treatment with lactated Ringer's solution. An increase in serum sodium concentrations significantly correlates with lower ICP and higher CPP. Children treated with hypertonic saline require fewer interventions, have fewer complications, and stay a shorter time in the ICU.
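The reported r, r2 and p values are ordinary Pearson correlation statistics between 4-hour means of serum sodium and ICP (or CPP). Purely as a hedged illustration of how such numbers are obtained — the paired values below are synthetic, not the study's data — the computation is a few lines of Python:

```python
# Illustrative only: how Pearson r, r^2 and p are computed from paired
# 4-hour means of serum sodium and ICP. The numbers below are synthetic.
import numpy as np
from scipy.stats import pearsonr

sodium = np.array([135.0, 138.0, 141.0, 144.0, 147.0, 150.0])   # mmol/L
icp    = np.array([ 18.0,  16.5,  15.0,  14.5,  13.0,  12.5])   # mm Hg

r, p = pearsonr(sodium, icp)
print(f"r = {r:.2f}, r^2 = {r*r:.2f}, p = {p:.3f}")   # an inverse correlation
```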
Abstract:
Automated genome sequencing and annotation, as well as large-scale gene expression measurement methods, generate a massive amount of data for model organisms such as human and mouse. Searching for gene-specific or organism-specific information throughout all the different databases has become a very difficult task, and often results in fragmented and unrelated answers. A database that federates and integrates genomic and transcriptomic data will greatly improve search speed as well as the quality of the results by allowing a direct comparison of expression results obtained by different techniques. The main goal of this project, called the CleanEx database, is thus to provide access to public gene expression data via unique gene names and to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-dataset comparisons.
A consistent and up-to-date gene nomenclature is achieved by associating each single gene expression experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of genes from model organisms, such as human and mouse. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The Affymetrix mapping files are accessible as text files, for further use in external applications, and as individual entries via the web-based interfaces. The CleanEx web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria, as well as cross-dataset analysis tools and cross-chip gene comparison. These tools have proven to be very efficient in expression data comparison and even, to a certain extent, in detection of differentially expressed splice variants. The CleanEx flat files and tools are available online at http://www.cleanex.isb-sib.ch/.
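The core idea is a periodically rebuilt gene index that maps permanent target identifiers (clones, Affymetrix probe sets, and so on) to current official gene names and cross-references the expression datasets. As a hedged, simplified sketch of that idea — the identifiers and field names below are invented for illustration and are not CleanEx's actual schema — such an index can be represented as follows:

```python
# Simplified, hypothetical sketch of a target-to-gene index in the spirit of
# CleanEx: permanent target identifiers are remapped at regular intervals to
# the current gene catalogue, and each gene entry cross-references datasets.
from dataclasses import dataclass, field

@dataclass
class GeneEntry:
    symbol: str                                        # current official gene name
    targets: set[str] = field(default_factory=set)     # permanent target ids
    datasets: set[str] = field(default_factory=set)    # expression datasets

def rebuild_index(target_to_gene: dict[str, str],
                  dataset_targets: dict[str, list[str]]) -> dict[str, GeneEntry]:
    """Rebuild the gene index from the latest target->gene mapping."""
    index: dict[str, GeneEntry] = {}
    for dataset, targets in dataset_targets.items():
        for target in targets:
            gene = target_to_gene.get(target)
            if gene is None:                   # unmapped target: skip / flag quality
                continue
            entry = index.setdefault(gene, GeneEntry(gene))
            entry.targets.add(target)
            entry.datasets.add(dataset)
    return index

# Example (hypothetical identifiers):
index = rebuild_index({"AFFX_0001": "TP53", "IMAGE_42": "TP53"},
                      {"chip_experiment_1": ["AFFX_0001"],
                       "cdna_experiment_2": ["IMAGE_42"]})
print(index["TP53"].datasets)    # both experiments now join on the gene name
```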
Abstract:
Digital services require personal information for a variety of reasons. Due to advances in communication technology, new types of services are evolving along with traditional Internet services. Because of this diversity of services, the traditional approaches to handling personal information that were designed for Internet services are inadequate, and new approaches are therefore necessary. In this thesis, a solution where personal information is stored in and accessed from the user's mobile device is presented. This approach is called the Mobile Electronic Personality (ME). The ME approach is compared to existing approaches, which rely on a database either at a service, at a trusted third party or in a client program. Various properties of personal information are taken into account in the comparison of storage locations. The thesis presents both the internal architecture and the communication architecture of the ME. The internal architecture defines how the information is stored in the mobile device. The communication architecture defines how different types of services can access the information in the ME. The use of the architecture is described for services in different environments. A simple ME-based solution for the authentication of a user is defined. The authentication of the service, which is required to protect the privacy of users, is also presented.
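As a hedged sketch only — the field names and policy format are invented for illustration, not the ME specification — the idea of keeping personal data on the device and releasing attributes selectively per service can be captured like this:

```python
# Hypothetical sketch of the ME idea: personal data lives on the mobile device
# and a per-service policy decides which attributes a service may read.
class MobileElectronicPersonality:
    def __init__(self, attributes: dict[str, str]):
        self._attributes = attributes             # stored only on the device
        self._policies: dict[str, set[str]] = {}  # service id -> allowed fields

    def grant(self, service_id: str, allowed_fields: set[str]) -> None:
        self._policies[service_id] = allowed_fields

    def request(self, service_id: str, fields: list[str]) -> dict[str, str]:
        """A service asks for fields; only those the user granted are released."""
        allowed = self._policies.get(service_id, set())
        return {f: self._attributes[f] for f in fields
                if f in allowed and f in self._attributes}

me = MobileElectronicPersonality({"name": "A. User", "email": "user@example.org",
                                  "shipping_address": "Example Street 1"})
me.grant("webshop.example.com", {"name", "shipping_address"})
print(me.request("webshop.example.com", ["name", "email", "shipping_address"]))
# -> email is withheld because the user never granted it to this service
```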
Abstract:
Crystal growth is an essential phase in crystallization kinetics. The rate of crystal growth provides significant information for the design and control of crystallization processes; nevertheless, obtaining accurate growth rate data is still challenging due to a number of factors that prevail in crystal growth. In industrial crystallization, crystals are generally grown from multi-component and multi-particle solutions under complicated hydrodynamic conditions; thus, it is crucial to increase the general understanding of the growth kinetics in these systems. The aim of this work is to develop a model of the crystal growth rate from solution. An extensive literature review of crystal growth focuses on the modelling of growth kinetics and thermodynamics, and on new measuring techniques that have been introduced in the field of crystallization. The growth of a single crystal is investigated in binary and ternary systems. The binary system consists of potassium dihydrogen phosphate (KDP, crystallizing solute) and water (solvent), and the ternary system includes KDP, water and an organic admixture. The studied admixtures, urea, ethanol and 1-propanol, are employed at relatively high concentrations (of up to 5.0 molal). The influence of the admixtures on the solution thermodynamics is studied using the Pitzer activity coefficient model. A prediction method for the ternary solubility in the studied systems is introduced and verified. The growth rate of the KDP (101) face in the studied systems is measured in a growth cell as a function of supersaturation, the admixture concentration, the solution velocity over the crystal and temperature. In addition, the surface morphology of the KDP (101) face is studied using ex situ atomic force microscopy (AFM). The crystal growth rate in the ternary systems is modelled on the basis of a two-step growth model that combines the Maxwell-Stefan (MS) equations with a surface-reaction model. This model is used together with measured crystal growth rate data to develop a new method for the evaluation of the model parameters. The validation of the model is justified with experiments. The crystal growth rate in an imperfectly mixed suspension crystallizer is investigated using computational fluid dynamics (CFD). A solid-liquid suspension flow that includes multi-sized particles is described by a multi-fluid model together with a standard k-epsilon turbulence model and an interface momentum transfer model. The local crystal growth rate is determined from the calculated flow information in a diffusion-controlled crystal growth regime. The calculated results are evaluated experimentally.
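For background, the two-step growth model referred to above combines a bulk mass-transfer step and a surface-integration step acting in series. In its textbook form (shown here only for orientation; in the thesis the transfer step is described with the Maxwell-Stefan equations), the face growth rate R_G satisfies:

```latex
% Two-step crystal growth model (textbook form, for orientation only):
% bulk mass transfer to the interface and surface integration in series.
R_G = k_d\,(c_b - c_i)                    % mass-transfer (diffusion) step
\qquad
R_G = k_r\,(c_i - c^{*})^{\,r}            % surface-reaction step
% Eliminating the unmeasurable interfacial concentration c_i for r = 1:
R_G = \frac{c_b - c^{*}}{1/k_d + 1/k_r}
```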
Abstract:
1. Introduction "The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49.1 These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases,2 the EDD, into Finnish copyright legislation in 1998. Now, in the year 2005, after more than half a decade of domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection remain uncertain, both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular and currently peculiarly European new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD, and second, to realise the potential and risks inherent in the new legislation in its economic, cultural and societal dimensions. 2. Subject-matter of the study: basic issues The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches, either qualitatively or quantitatively, the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can provide the consumer markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form, or even free where public lending libraries provide access to the information online. This also renders it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed.
The fear of illegal copying can lead to stark technical protection, which in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, thus weakening the possibility of access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy. 3. Particular issues in the Digital Economy and Information Networks All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the Mobile Internet, peer-to-peer networks, and Local and Wide Local Area Networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables,3 previously protected partially by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, and importantly, numerous databases are collected in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example of this is a database consisting of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: Has the collection and securing of the validity of this information required an essential input? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are the materials regarded as arranged in a systematic or methodical way? Only once the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials seem to fit ill, or lead to interpretations that are at variance with the analogue domain as regards the lawful and illegal uses of information. This may well interfere with or rework the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services. 4. International sphere After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically.
Consequently, guidelines on the correct interpretation of the Directive, importing practical, business-oriented solutions, may well have application at the European level. This underlines the exigency of a thorough analysis of the implications of the meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, directly having a negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least yet, have a Sui Generis database regime or its kin, while both political and academic discourse on the matter abounds. 5. The objectives of the study The background outlined above, with its several open issues, calls for a detailed study of the following questions: - What is a database-at-law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation? - How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context? - What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications? - The difficult question of the relation between database protection and the protection of factual information as such. 6. Disposition The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution path of European database protection, reflected against the international backdrop on the issue. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and Sui Generis right facets in detail, together with the emergent application of the machinery in its real-life societal and particularly commercial context. Furthermore, a general outline of copyright, relevant in the context of copyright-protected databases, is provided. For purposes of further comparison, a chapter on the precursor of the Sui Generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinize its implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue concerning the IPR protection of information per se, a new tenet in the domain of copyright and related rights.
Abstract:
For a family of reduced games satisfying a monotonicity property, we introduce the reduced equal split-off set, an extension of the equal split-off set (Branzei et al., 2006), and study its relation with the core. Regardless of the reduction operation we consider, the intersection between both sets is either empty or a singleton containing the lexmax solution (Arin et al., 2008). We also provide a procedure for computing the lexmax solution for a class of games that includes games with a large core (Sharkey, 1982). [JEL Classification: C71]
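For orientation, the (non-reduced) equal split-off set of Branzei et al. (2006) is commonly described as repeatedly splitting off a coalition of remaining players with maximal per-capita marginal worth and paying each of its members that average; the reduced variant studied here changes the game used at each step. The sketch below implements only that classical procedure on a toy game, as a hedged illustration rather than the paper's own algorithm.

```python
# Hedged illustration: one allocation from the classical equal split-off
# procedure for a small TU game. At each step a coalition of remaining players
# with maximal per-capita marginal worth is split off, and each of its members
# receives that average. Ties are broken deterministically here.
from itertools import combinations

def equal_split_off(players, v):
    """players: tuple of names; v: dict frozenset -> worth (empty set worth 0)."""
    allocation, assigned = {}, frozenset()
    remaining = set(players)
    while remaining:
        best, best_avg = None, float("-inf")
        for size in range(1, len(remaining) + 1):
            for combo in combinations(sorted(remaining), size):
                s = frozenset(combo)
                avg = (v[assigned | s] - v.get(assigned, 0.0)) / len(s)
                if avg > best_avg:
                    best, best_avg = s, avg
        for i in best:
            allocation[i] = best_avg
        assigned |= best
        remaining -= best
    return allocation

# Toy 3-player game (hypothetical worths):
v = {frozenset(): 0.0, frozenset("a"): 0.0, frozenset("b"): 0.0, frozenset("c"): 0.0,
     frozenset("ab"): 4.0, frozenset("ac"): 3.0, frozenset("bc"): 2.0,
     frozenset("abc"): 6.0}
print(equal_split_off(("a", "b", "c"), v))   # -> {'a': 2.0, 'b': 2.0, 'c': 2.0}
```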
Abstract:
The major task of policy makers and practitioners when confronted with a resource management problem is to decide on the potential solution(s) to adopt from a range of available options. However, this process is unlikely to be successful and cost-effective without access to an independently verified and comprehensive list of the available options. There is currently burgeoning interest in ecosystem services and quantitative assessments of their importance and value. Recognition of the value of ecosystem services to human well-being represents an increasingly important argument for protecting and restoring the natural environment, alongside the moral and ethical justifications for conservation. As well as understanding the benefits of ecosystem services, it is also important to synthesize the practical interventions that are capable of maintaining and/or enhancing these services. Apart from pest regulation, pollination, and global climate regulation, this type of exercise has attracted relatively little attention. Through a systematic consultation exercise, we identify a candidate list of 296 possible interventions across the main regulating services of air quality regulation, climate regulation, water flow regulation, erosion regulation, water purification and waste treatment, disease regulation, pest regulation, pollination and natural hazard regulation. The range of interventions differs greatly between habitats and services depending upon the ease of manipulation and the level of research intensity. Some interventions have the potential to deliver benefits across a range of regulating services, especially those that reduce soil loss and maintain forest cover. Synthesis and applications: Solution scanning is important for questioning existing knowledge and identifying the range of options available to researchers and practitioners, as well as serving as the necessary basis for assessing cost-effectiveness and guiding implementation strategies. We recommend that it become a routine part of decision making in all environmental policy areas.
Abstract:
Analyzing the type and frequency of patient-specific mutations that give rise to Duchenne muscular dystrophy (DMD) is an invaluable tool for diagnostics, basic scientific research, trial planning, and improved clinical care. Locus-specific databases allow for the collection, organization, storage, and analysis of genetic variants of disease. Here, we describe the development and analysis of the TREAT-NMD DMD Global database (http://umd.be/TREAT_DMD/). We analyzed genetic data for 7,149 DMD mutations held within the database. A total of 5,682 large mutations were observed (80% of total mutations), of which 4,894 (86%) were deletions (1 exon or larger) and 784 (14%) were duplications (1 exon or larger). There were 1,445 small mutations (smaller than 1 exon, 20% of all mutations), of which 358 (25%) were small deletions, 132 (9%) were small insertions, and 199 (14%) affected the splice sites. Point mutations totalled 756 (52% of small mutations), comprising 726 (50%) nonsense mutations and 30 (2%) missense mutations. Finally, 22 (0.3%) mid-intronic mutations were observed. In addition, mutations were identified within the database that would potentially be amenable to novel genetic therapies for DMD, including stop codon read-through therapies (10% of total mutations) and exon skipping therapy (80% of deletions and 55% of total mutations).
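The percentages above follow directly from the reported counts (7,149 mutations in total, 5,682 large and 1,445 small). As a quick arithmetic check, using only figures quoted in the abstract, the shares can be reproduced as follows:

```python
# Arithmetic check of selected mutation shares quoted above (counts from the abstract).
counts = {
    "large mutations": (5682, 7149),   # of all mutations   -> ~80%
    "  deletions":     (4894, 5682),   # of large mutations -> ~86%
    "  duplications":  (784, 5682),    # of large mutations -> ~14%
    "small mutations": (1445, 7149),   # of all mutations   -> ~20%
    "  nonsense":      (726, 1445),    # of small mutations -> ~50%
    "mid-intronic":    (22, 7149),     # of all mutations   -> ~0.3%
}
for label, (n, total) in counts.items():
    print(f"{label:>16}: {100 * n / total:5.1f} %")
```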
Abstract:
Summary: Occurrence of the amino acid sequences of angiotensin I-converting enzyme inhibitory peptides in the structure of cereal storage proteins
Abstract:
We use interplanetary transport simulations to compute a database of electron Green's functions, i.e., differential intensities resulting at the spacecraft position from an impulsive injection of energetic (>20 keV) electrons close to the Sun, for a large number of values of two standard interplanetary transport parameters: the scattering mean free path and the solar wind speed. The nominal energy channels of the ACE, STEREO, and Wind spacecraft have been used in the interplanetary transport simulations to conceive a unique tool for the study of near-relativistic electron events observed at 1 AU. In this paper, we quantify the characteristic times of the Green's functions (onset and peak time, rise and decay phase duration) as a function of the interplanetary transport conditions. We use the database to calculate the FWHM of the pitch-angle distributions at different times of the event and under different scattering conditions. This allows us to provide a first quantitative result that can be compared with observations, and to assess the validity of the frequently used term beam-like pitch-angle distribution.
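The FWHM quoted above is a simple width measure of the pitch-angle distribution at a given time. Purely as a hedged illustration — the Gaussian-shaped test distribution below is synthetic, not drawn from the database — it can be extracted from any sampled distribution like this:

```python
# Illustrative only: full width at half maximum (FWHM) of a pitch-angle
# distribution sampled on a grid. The beam-like test data are synthetic.
import numpy as np

def fwhm(x: np.ndarray, y: np.ndarray) -> float:
    """Width of the interval where y exceeds half of its maximum (linear interp)."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    left, right = above[0], above[-1]

    def cross(i, j):
        # Linear interpolation of the half-maximum crossing between samples i and j.
        return x[i] + (half - y[i]) * (x[j] - x[i]) / (y[j] - y[i])

    x_left = cross(left - 1, left) if left > 0 else x[0]
    x_right = cross(right + 1, right) if right < len(x) - 1 else x[-1]
    return x_right - x_left

mu = np.linspace(-1.0, 1.0, 201)                   # pitch-angle cosine
intensity = np.exp(-((mu - 0.8) / 0.2) ** 2)       # beam-like distribution
print(f"FWHM = {fwhm(mu, intensity):.3f}")         # ~0.33 for this test case
```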
Abstract:
The recent development and evolution of wireless technologies open up new possibilities for building business-to-business industrial applications. The aim of this work is to study the process of establishing the technical framework and foundation, and of technological forecasting, within the development process of innovative wireless applications. The work concentrates on wireless technologies - networks and terminal devices. It surveys available and upcoming wireless network technologies and mobile terminals, assesses their main types, characteristics, limitations and development trends, and defines the principal technical characteristics that must be taken into account when creating a wireless solution. This information is drawn together for further use into a terminal database for wireless applications during its construction. The work provides a description of the design and construction of the terminal database and explores the database by means of an innovative example application - a Real-time On-Line Customer Service.
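The terminal database described above collects, for each device type, the technical characteristics a wireless application must respect. As a hedged sketch only — the table layout and example devices are invented for illustration, not the database built in the thesis — the idea can be expressed with a few lines of SQL issued from Python:

```python
# Hypothetical sketch of a terminal (device-capability) database: each row
# records characteristics a wireless application needs to adapt its output.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE terminal (
        model         TEXT PRIMARY KEY,
        screen_width  INTEGER,        -- pixels
        screen_height INTEGER,        -- pixels
        networks      TEXT,           -- e.g. 'GSM,GPRS,WLAN'
        markup        TEXT            -- e.g. 'WML' or 'XHTML'
    )""")
conn.executemany(
    "INSERT INTO terminal VALUES (?, ?, ?, ?, ?)",
    [("phone-basic", 128, 160, "GSM,GPRS", "WML"),
     ("pda-pro",     240, 320, "GPRS,WLAN", "XHTML")])

# An application (such as an on-line customer-service front end) queries the
# database to pick a presentation suited to the requesting terminal.
row = conn.execute(
    "SELECT markup, screen_width FROM terminal WHERE model = ?",
    ("pda-pro",)).fetchone()
print(row)   # ('XHTML', 240)
```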