931 results for Bibliographical Database – Aleph 500
Abstract:
An agreement is reached between the GREFEMA research group and ViCOROB to study the bladed thrusters used to date on the Girona 500 underwater robot, so that the resulting model serves as a tool for studying any type of thruster one may wish to use. A simulation model will be built with ANSYS CFD in order to recreate any situation with any thruster model, saving purchase and manufacturing costs and avoiding an experimental setup that may not be entirely reliable. Starting from the geometries of existing commercial bladed thrusters, a simulation will be run with the ANSYS computational fluid dynamics (CFD) package. The information provided by the simulation tool will be compared with the results obtained empirically at the facilities of the Scientific and Technological Park of the University of Girona and with the theoretical model. In this way, the goodness of the simulation will be verified and the numerical model validated.
Abstract:
The purpose of this investigation was to evaluate the Compensatory Wetland Mitigation Program at the Iowa Department of Transportation (DOT) in terms of regulatory compliance. Specific objectives included: 1) determining whether study sites meet the definition of a jurisdictional wetland; 2) determining the degree of compliance with requirements specified in Clean Water Act Section 404 permits. A total of 24 study sites, in four age classes, were randomly selected from the more than 80 sites currently managed by the Iowa DOT. Wetland boundaries were delineated in the field, and mitigation compliance was determined by comparing the delineated wetland acreage at each study site to the total wetland acreage requirements specified in individual CWA Section 404 permits. Of the 24 sites evaluated in this study, 58 percent meet or exceed Section 404 permit requirements. Net gain ranged from 0.19 acre to 27.2 acres; net loss ranged from 0.2 acre to 14.6 acres. The Denver Bypass 1 site was the worst performer, with zero acres of wetland present on the site, and the Akron Wetland Mitigation Site was the best performer, with slightly more than 27 acres over the permit requirement. Five of the 10 under-performing sites are more than five years post-construction, two are five years post-construction, one is three years post-construction, and the remaining two are one year post-construction. Of the sites that meet or exceed permit requirements, approximately 93 percent are five years or less post-construction and approximately 43 percent are only one year old. Only one of the 14 successful sites is more than five years old. Using Section 404 permit acreage requirements as the criterion for measuring success, 58 percent of the wetland mitigation sites investigated in this study are successful. Using net gain/loss as the measure of success, the Compensatory Wetland Mitigation Program has created or restored nearly 44 acres of wetland beyond what was required by permits.
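The compliance check described in the abstract is simple acreage arithmetic: delineated wetland acreage minus the permit requirement gives the net gain or loss. A minimal sketch follows; the site names are from the abstract, but the per-site acreage pairs are hypothetical, since only the net extremes (+27.2 and -14.6 acres) are reported.

```python
# Compliance check sketched from the abstract: a site meets its Clean Water
# Act Section 404 permit when delineated acreage >= required acreage.
# Site names appear in the abstract; the acreage pairs are hypothetical,
# chosen to reproduce the reported net extremes (+27.2 and -14.6 acres).
def assess_site(delineated_acres: float, required_acres: float) -> tuple:
    """Return compliance status and net gain (+) or loss (-) in acres."""
    net = round(delineated_acres - required_acres, 2)
    status = "meets or exceeds permit" if net >= 0 else "under-performing"
    return status, net

sites = {
    "Akron Wetland Mitigation Site": (40.0, 12.8),
    "Denver Bypass 1": (0.0, 14.6),
}
results = {name: assess_site(d, r) for name, (d, r) in sites.items()}
```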
Abstract:
For well over 100 years, the Working Stress Design (WSD) approach has been the traditional basis for geotechnical design with regard to settlements or failure conditions. However, considerable effort has been put forth over the past couple of decades in relation to the adoption of the Load and Resistance Factor Design (LRFD) approach into geotechnical design. With the goal of producing engineered designs with consistent levels of reliability, the Federal Highway Administration (FHWA) issued a policy memorandum on June 28, 2000, requiring all new bridges initiated after October 1, 2007, to be designed according to the LRFD approach. Likewise, regionally calibrated LRFD resistance factors were permitted by the American Association of State Highway and Transportation Officials (AASHTO) to improve the economy of bridge foundation elements. Thus, projects TR-573, TR-583 and TR-584 were undertaken by a research team at Iowa State University’s Bridge Engineering Center with the goal of developing resistance factors for pile design using available pile static load test data. To accomplish this goal, the available data were first analyzed for reliability and then placed in a newly designed relational database management system termed PIle LOad Tests (PILOT), to which this first volume of the final report for project TR-573 is dedicated. PILOT is an amalgamated, electronic source of information consisting of both static and dynamic data for pile load tests conducted in the State of Iowa. The database, which includes historical data on pile load tests dating back to 1966, is intended for use in the establishment of LRFD resistance factors for design and construction control of driven pile foundations in Iowa. 
Although a considerable amount of geotechnical and pile load test data is available in the literature, as well as in various state department of transportation files, PILOT is one of the first regional databases to be used exclusively in the development of LRFD resistance factors for the design and construction control of driven pile foundations. Currently providing an electronically organized assimilation of geotechnical and pile load test data for 274 piles of various types (e.g., steel H-shaped, timber, pipe, Monotube, and concrete), PILOT (http://srg.cce.iastate.edu/lrfd/) is on par with familiar national databases used in the calibration of LRFD resistance factors for pile foundations, such as the FHWA's Deep Foundation Load Test Database. By narrowing geographical boundaries while maintaining a high number of pile load tests, PILOT exemplifies a model for effective regional LRFD calibration procedures.
Abstract:
In the context of recent attempts to redefine the 'skin notation' concept, a position paper summarizing an international workshop on the topic stated that the skin notation should be a hazard indicator related to a chemical's degree of toxicity and potential for transdermal exposure. Within the framework of developing a web-based tool integrating this concept, we constructed a database of 7101 agents for which a percutaneous permeation constant can be estimated (using molecular weight and the octanol-water partition constant), and for which at least one of the following toxicity indices could be retrieved: inhalation occupational exposure limit (n=644), oral lethal dose 50 (LD50, n=6708), cutaneous LD50 (n=1801), oral no-observed-adverse-effect level (NOAEL, n=1600), and cutaneous NOAEL (n=187). Data sources included the Registry of Toxic Effects of Chemical Substances (RTECS, MDL Information Systems, Inc.), PHYSPROP (Syracuse Research Corp.), and safety cards from the International Programme on Chemical Safety (IPCS). A hazard index was calculated, corresponding to the product of exposure duration and exposed skin surface that would yield an internal dose equal to a toxic reference dose. This presentation provides a descriptive summary of the database, correlations between the toxicity indices, and an example of how the web tool will help industrial hygienists decide on the possibility of a dermal risk using the hazard index.
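The hazard index defined above (exposure duration times exposed skin surface that would deliver an internal dose equal to a reference dose) can be sketched in a few lines. The Potts-Guy correlation below is an assumed choice for estimating the permeation constant from molecular weight and log Kow; the abstract says only that these two descriptors are used, not which model.

```python
def potts_guy_kp(mw: float, log_kow: float) -> float:
    """Skin permeation coefficient Kp in cm/h estimated from molecular weight
    and the octanol-water partition coefficient. The Potts-Guy correlation is
    an assumed choice here, not stated in the abstract."""
    return 10 ** (-2.72 + 0.71 * log_kow - 0.0061 * mw)

def hazard_index(ref_dose_mg: float, conc_mg_per_cm3: float,
                 mw: float, log_kow: float) -> float:
    """Skin-surface x exposure-duration product (cm^2 * h) whose dermal
    uptake Kp * C * area * time equals the toxic reference dose."""
    return ref_dose_mg / (potts_guy_kp(mw, log_kow) * conc_mg_per_cm3)
```

More permeable chemicals (higher log Kow, lower molecular weight) get a smaller hazard index, i.e., a smaller exposed-surface/duration budget before the reference dose is reached.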
Abstract:
The objective of this work was to evaluate the performance of 15 clones of the IAC 500 series of Hevea brasiliensis, developed at Instituto Agronômico (IAC), over a 12-year period in the northwest region of São Paulo State, Brazil. The 15 new clones evaluated are primary clones obtained from selected ortets within half-sib progenies. The clone RRIM 600, of Malaysian origin, was used as the control. Dry rubber yield over a four-year period, mean girth at the tenth year, girth increment before and during tapping, thermal properties of the natural rubber produced, and other characters of the laticiferous system were evaluated. Forty percent of the clones were superior to the control for yield. Clone IAC 500 recorded the highest yield (66.81 g per tree per tapping) over four years of tapping, followed by IAC 502 (62.37 g per tree per tapping), whereas the control recorded 48.71 g per tree per tapping. All selected clones were vigorous in growth. The natural rubber from these IAC clones showed thermal stability up to 300°C. No differences were observed in the thermal behavior of rubber between the IAC series and the RRIM 600 clone. The clones IAC 500, IAC 501, IAC 502, IAC 503, and IAC 506 are the most promising for small-scale plantations, owing to their growth and yield potential.
Abstract:
The Quaternary Active Faults Database of Iberia (QAFI) is an initiative led by the Institute of Geology and Mines of Spain (IGME) to build a public repository of scientific data on faults with documented activity during the last 2.59 Ma (the Quaternary). QAFI also addresses the need to transfer geologic knowledge to practitioners of seismic hazard and risk in Iberia by identifying and characterizing seismogenic fault sources. QAFI is populated with information freely provided by more than 40 Earth science researchers and stores, to date, a total of 262 records. In this article we describe the development and evolution of the database, as well as its internal architecture. Additionally, a first global analysis of the data is provided, with a special focus on the fault length and slip-rate parameters. Finally, the completeness of the database and the internal consistency of the data are discussed. Even though QAFI v.2.0 is the most current resource for calculating fault-related seismic hazard in Iberia, the database is still incomplete and requires further review.
Abstract:
This Master's thesis examines the implementation of real-time activity-based costing in the information system of a Finnish SME that manufactures laser chips. The effects of activity-based costing on operations and on activity-based management are also considered. The literature part of the thesis covers activity-based costing theories, calculation methods, and the technologies used in the technical implementation. In the implementation part, a web-based activity-based costing system was designed and built to support the case company's cost accounting and financial administration. The tool was integrated into the company's ERP and manufacturing execution systems. In contrast to traditional data-collection arrangements for activity-based costing models, the inputs to the costing system in the case company arrive in real time as part of a larger information-system integration. The thesis aims to establish the relationship between the requirements of activity-based costing and database systems. The company can use the costing system, for example, in product pricing and cost accounting, by viewing product-related costs from different perspectives. Conclusions can be drawn from accurate cost information, and the data produced by the system can be used to determine whether developing a given project, customer relationship, or product is economically viable.
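The cost-allocation core of activity-based costing can be sketched briefly: activity cost pools are divided by driver volumes to obtain rates, and each product is charged for the driver units it consumes. All activity names and figures below are hypothetical, not the case company's data.

```python
# A minimal activity-based costing sketch: activity cost pools are turned
# into rates per cost-driver unit, and each product is charged for the
# driver volume it consumes. All names and figures are hypothetical.
pools = {"laser cutting": 50_000.0, "inspection": 20_000.0}   # period cost per activity
driver_volume = {"laser cutting": 1_000, "inspection": 400}   # total driver units
usage = {
    "chip A": {"laser cutting": 600, "inspection": 100},
    "chip B": {"laser cutting": 400, "inspection": 300},
}

def product_cost(product: str) -> float:
    """Sum of (activity rate x driver units consumed) over all activities."""
    rates = {a: pools[a] / driver_volume[a] for a in pools}
    return sum(rates[a] * q for a, q in usage[product].items())
```

Because the rates are recomputed from the pools each time, feeding the pools and driver volumes from a live ERP integration, as the thesis describes, keeps the product costs current without a separate data-collection round.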
Abstract:
Abstract: Automatic genome sequencing and annotation, together with large-scale gene expression measurement methods, generate a massive amount of data for model organisms such as human and mouse. Searching for gene-specific or organism-specific information throughout all the different databases has become a very difficult task, and often yields fragmented and unrelated answers. A database that federates and integrates genomic and transcriptomic data can greatly improve search speed as well as the quality of the results, by allowing a direct comparison of expression results obtained by different techniques. The main goal of this project, called the CleanEx database, is thus to provide access to public gene expression data via unique gene names, and to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-dataset comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single gene expression experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving gene catalogues of model organisms. The fully automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality-control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The Affymetrix mapping files are accessible as text files, for further use in external applications, and as individual entries via the web-based interfaces. The CleanEx web-based query interfaces offer access to individual entries via text-string searches or quantitative expression criteria, as well as cross-dataset analysis tools and cross-chip gene comparison. These tools have proven very efficient in expression data comparison and even, to a certain extent, in the detection of differentially expressed splice variants. The CleanEx flat files and tools are available online at http://www.cleanex.isb-sib.ch/.
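The weekly gene-index build described above, in which permanent target identifiers are re-mapped to the current gene catalogue, can be sketched as a simple mapping step. All identifiers and catalogue entries below are illustrative, not CleanEx data.

```python
# Sketch of the weekly index build described for CleanEx: permanent target
# identifiers are re-mapped to the current gene catalogue on each build, so
# renamed or retired genes are handled automatically. All identifiers and
# catalogue entries below are illustrative, not CleanEx data.
catalogue = {"NM_000546": "TP53", "NM_004333": "BRAF"}   # sequence -> gene symbol
targets = {
    "target:0001": "NM_000546",
    "target:0002": "NM_004333",
    "target:0003": "NM_999999",    # no longer present in the catalogue
}

def build_gene_index(targets: dict, catalogue: dict) -> dict:
    """Gene symbol -> list of expression targets mapping to it this build."""
    index = {}
    for target_id, seq_id in targets.items():
        symbol = catalogue.get(seq_id)
        if symbol is None:
            continue               # unmapped targets drop out of this build
        index.setdefault(symbol, []).append(target_id)
    return index
```

Keeping the target identifier permanent while re-running this mapping weekly is what lets expression experiments follow the evolving gene nomenclature without being re-annotated by hand.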
Abstract:
1. Introduction
"The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public", Finnish Copyright Act, section 49.1 These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases,2 the EDD, into Finnish copyright legislation in 1998. Now, in 2005, more than half a decade after the domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection, both in Finland and within the European Union, remain uncertain. Further, this opaque pan-European instrument has the potential to bring about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular, and currently peculiarly European, new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD and, second, to realise the potential and risks inherent in the new legislation in its economic, cultural and societal dimensions.

2. Subject-matter of the study: basic issues
The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD (the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches, either qualitatively or quantitatively, the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework) remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can give the consumer markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form, or even free where public lending libraries provide access to the information online. This also makes it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side, for authors and publishers, is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed. The fear of illegal copying can lead to stark technical protection that in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, and thus weaken access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy.

3. Particular issues in the digital economy and information networks
All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the mobile Internet, peer-to-peer networks, and local and wide area networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables,3 previously protected partially by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, numerous databases are compiled in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded therein constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant, though not immediately obvious, example is a database of the physical coordinates of a selected group of customers, used for marketing through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: has the collection and securing of the validity of this information required an essential investment? What qualifies as a quantitatively or qualitatively significant investment? According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only when the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and making available to the public of digital materials fit ill, or lead to interpretations that are at variance with the analogous domain as regards the lawful and unlawful uses of information. This may well interfere with, or rework, the way in which commercial and other operators establish themselves and function in the existing value networks of information products and services.

4. The international sphere
After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising from the implementation and interpretation of the Directive at the European level attract significance domestically. Consequently, guidelines on the correct interpretation of the Directive that import practical, business-oriented solutions may well find application at the European level. This underlines the exigency of a thorough analysis of the implications, meaning and potential scope of database protection in Finland and the European Union. This position has to be contrasted with the larger international sphere, which in early 2005 differs markedly from the European Union stance, directly having a negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least yet, have a sui generis database regime or its kin, while both the political and academic discourse on the matter abounds.

5. The objectives of the study
The above-mentioned background, with its several open issues, calls for a detailed study of the following questions:
- What is a database-at-law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation?
- How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context?
- What opportunities and threats does the current protection present to creators, users and society as a whole, including the commercial and cultural implications?
- The difficult question of the relation between database protection and the protection of factual information as such.

6. Disposition
The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution of European database protection, reflected against the international backdrop. An introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen, existing two-tier model of database protection, reviewing both its copyright and sui generis facets in detail, together with the emergent application of the machinery in real-life societal, and particularly commercial, contexts. Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided. For purposes of further comparison, a chapter on the precursor of the sui generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinise its implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue of the IPR protection of information per se, a new tenet in the domain of copyright and related rights.
Abstract:
Analyzing the type and frequency of patient-specific mutations that give rise to Duchenne muscular dystrophy (DMD) is an invaluable tool for diagnostics, basic scientific research, trial planning, and improved clinical care. Locus-specific databases allow for the collection, organization, storage, and analysis of genetic variants of disease. Here, we describe the development and analysis of the TREAT-NMD DMD Global database (http://umd.be/TREAT_DMD/). We analyzed genetic data for 7,149 DMD mutations held within the database. A total of 5,682 large mutations were observed (80% of total mutations), of which 4,894 (86%) were deletions (1 exon or larger) and 784 (14%) were duplications (1 exon or larger). There were 1,445 small mutations (smaller than 1 exon, 20% of all mutations), of which 358 (25%) were small deletions, 132 (9%) were small insertions, and 199 (14%) affected splice sites. Point mutations totalled 756 (52% of small mutations), with 726 (50%) nonsense mutations and 30 (2%) missense mutations. Finally, 22 (0.3%) mid-intronic mutations were observed. In addition, mutations were identified within the database that would potentially benefit from novel genetic therapies for DMD, including stop codon read-through therapies (10% of total mutations) and exon skipping therapy (80% of deletions and 55% of total mutations).
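The small-mutation percentages quoted above can be recomputed directly from the reported counts:

```python
# Recomputing the small-mutation percentages from the counts in the abstract.
def pct(part: int, whole: int) -> int:
    """Percentage rounded to the nearest whole number."""
    return round(100 * part / whole)

small_total = 1445
small_counts = {
    "small deletions": 358,
    "small insertions": 132,
    "splice-site": 199,
    "nonsense": 726,
    "missense": 30,
}
shares = {kind: pct(n, small_total) for kind, n in small_counts.items()}
```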
Abstract:
BACKGROUND: Cancer mortality statistics for 2015 were projected from the most recent available data for the European Union (EU) and its six most populous countries. Prostate cancer was analysed in detail. PATIENTS AND METHODS: Population and death certification data for stomach, colorectal, pancreatic, lung, breast, uterine, and prostate cancers, leukaemias, and total cancers were obtained from the World Health Organization database and Eurostat. Figures were derived for the EU, France, Germany, Italy, Poland, Spain, and the UK. Projected numbers of deaths by age group for 2015 were obtained by linear regression on the estimated numbers of deaths over the most recent time period identified by a joinpoint regression model. RESULTS: A total of 1 359 100 cancer deaths are predicted in the EU in 2015 (766 200 men and 592 900 women), corresponding to standardised death rates of 138.4/100 000 men and 83.9/100 000 women, falls of 7.5% and 6%, respectively, since 2009. In men, predicted rates for the three major cancers (lung, colorectum and prostate) are lower than in 2009, falling 9%, 5% and 12%, respectively. Prostate cancer showed predicted falls of 14%, 17% and 9% in the 35-64, 65-74 and 75+ age groups, respectively. In women, breast and colorectal cancers had favourable trends (-10% and -8%), but predicted lung cancer rates rise 9% to 14.24/100 000, making lung the cancer with the highest rate, reaching and possibly overtaking breast cancer rates, though the total number of deaths remains higher for breast (90 800) than for lung (87 500). Pancreatic cancer has a negative outlook in both sexes, rising 4% in men and 5% in women between 2009 and 2015. CONCLUSIONS: The cancer mortality predictions for 2015 confirm the overall favourable cancer mortality trend in the EU, translating into an overall fall of 26% in men since the peak in 1988, and of 21% in women, and the avoidance of over 325 000 deaths in 2015 compared with the peak rate.
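The projection step described in the methods (a linear regression on yearly deaths over the most recent linear segment, extrapolated to the target year) can be sketched as follows; the yearly counts below are illustrative, not data from the study.

```python
# A minimal sketch of the projection step: an ordinary least-squares line
# fitted to yearly death counts over the recent linear segment (as a
# joinpoint model would identify), then extrapolated to 2015.
# The counts below are illustrative, not data from the study.
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

years = [2007, 2008, 2009, 2010, 2011]
deaths = [1000, 990, 985, 975, 970]          # illustrative declining trend
slope, intercept = fit_line(years, deaths)
projected_2015 = slope * 2015 + intercept    # linear extrapolation
```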
Abstract:
Summary: Occurrence of the amino acid sequences of angiotensin I-converting enzyme inhibitory peptides in the structure of cereal storage proteins
Abstract:
We use interplanetary transport simulations to compute a database of electron Green's functions, i.e., the differential intensities resulting at the spacecraft position from an impulsive injection of energetic (>20 keV) electrons close to the Sun, for a large number of values of two standard interplanetary transport parameters: the scattering mean free path and the solar wind speed. The nominal energy channels of the ACE, STEREO, and Wind spacecraft have been used in the interplanetary transport simulations to create a unique tool for the study of near-relativistic electron events observed at 1 AU. In this paper, we quantify the characteristic times of the Green's functions (onset and peak time, rise and decay phase duration) as a function of the interplanetary transport conditions. We use the database to calculate the FWHM of the pitch-angle distributions at different times of the event and under different scattering conditions. This allows us to provide a first quantitative result that can be compared with observations, and to assess the validity of the frequently used term 'beam-like' pitch-angle distribution.
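Computing the FWHM of a sampled pitch-angle distribution reduces to locating the two half-maximum crossings of the curve. A minimal sketch follows, validated against a Gaussian, whose analytic FWHM is 2*sqrt(2 ln 2)*sigma; the interpolation scheme here is a generic choice, not the method of the paper.

```python
import math

# Sketch of an FWHM measurement for a sampled, single-peaked curve such as
# a pitch-angle distribution: find the two half-maximum crossings by linear
# interpolation between samples. Verified below against a Gaussian, whose
# analytic FWHM is 2*sqrt(2*ln 2)*sigma.
def fwhm(xs, ys):
    """Full width at half maximum of a sampled single-peaked curve."""
    half = max(ys) / 2.0

    def crossing(pairs):
        for i, j in pairs:
            if (ys[i] - half) * (ys[j] - half) <= 0 and ys[i] != ys[j]:
                t = (half - ys[i]) / (ys[j] - ys[i])
                return xs[i] + t * (xs[j] - xs[i])
        return None

    n = len(xs)
    left = crossing([(i, i + 1) for i in range(n - 1)])          # scan from left
    right = crossing([(i, i - 1) for i in range(n - 1, 0, -1)])  # scan from right
    return right - left

sigma = 10.0
xs = [0.1 * k for k in range(-500, 501)]
ys = [math.exp(-x * x / (2.0 * sigma ** 2)) for x in xs]
expected_fwhm = 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma
```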