967 results for Object oriented database
Abstract:
Résumé: The construction of a second metro line (M2) starting in 2004, running through downtown Lausanne, provided the opportunity to develop a methodology for microgravity surveys in a disturbed urban environment. Terrain corrections take on a particular importance in such a setting, because many non-geological objects of anthropogenic origin, such as empty basements of all kinds, perturb the gravity measurements. The preliminary civil engineering studies for the metro supplied a large amount of cadastral information, notably the building outlines, the planned position of the M2 tube, basement depths in the vicinity of the tube, and the geology encountered along the M2 corridor (derived from the lithological data of geotechnical boreholes). The plan geometry of the basements was processed from the building outlines in a GIS (Geographic Information System), while a door-to-door survey was needed to measure basement heights. Starting from an existing DTM (Digital Terrain Model) on a one-metre grid, it was then possible to update it with the voids that these basements represent. The gravity measurement cycles were processed in Access databases to allow better control of the data, faster processing, and easier retroactive terrain correction, in particular when the topography is updated during the construction works. The Caroline district (between the Bessières bridge and the Place de l'Ours) was chosen as the study area because, during this thesis, it passed chronologically through both the pre- and post-excavation phases of the M2 tunnel. This allowed us to carry out two gravity surveys (before excavation during summer 2005 and after excavation during summer 2007). These repeat surveys made it possible to test our model of the tunnel: by comparing the measurements of the two campaigns with the gravity response of the tube model discretized into rectangular prisms, we were able to validate our modeling method. The modeling we developed makes it possible to build the shape of the object in detail, with the possibility of crossing geological interfaces and the topographic surface several times. This type of modeling can be applied to any anthropogenic construction of linear shape.

Abstract: The construction of a second underground line (M2) in 2004 in downtown Lausanne was the opportunity to develop a microgravity methodology for urban environments. Terrain corrections take on special meaning in such an environment, because many non-geological anthropogenic objects, such as basements, perturb the gravity measurements. Civil engineering studies provided a large amount of cadastral information, including building outlines, the position of the M2 tube, the depths of some basements in the vicinity of the M2 corridor, and the geology encountered along the corridor (from borehole lithological data). The geometry of the basements was deduced from the building outlines in a GIS (Geographic Information System), and a field investigation was carried out to measure or estimate basement heights. A DEM (Digital Elevation Model) of the city of Lausanne was then updated with the basement voids. Gravity measurement cycles were processed in an Access database to enable greater control of the data, faster processing, and easier retroactive terrain correction when updates of the topographic surface become available. The Caroline area (between the Saint-Martin bridge and the Place de l'Ours) was chosen as the study area; it was of particular interest because both the pre- and post-excavation phases fell within this thesis. This allowed us to conduct two gravity surveys (before excavation during summer 2005 and after excavation during summer 2007). These re-occupations enabled us to test our model of the tube: by comparing the difference in measurements between the two surveys with the gravity response of our model (built from rectangular prisms), we were able to validate our modeling approach. The modeling method we developed allows us to construct the detailed shape of an object, with the possibility of crossing geological interfaces and the surface topography several times. This type of modeling can be applied to all linear anthropogenic structures.
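The gravity response of a tube discretized into rectangular prisms, as described above, can be computed from the classical closed-form expression for the vertical attraction of a right rectangular prism (Nagy-type formula). The following Python sketch is only an illustration of that idea; the prism geometry, density contrast and station positions are hypothetical and are not the values used in the thesis.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def prism_gz(x1, x2, y1, y2, z1, z2, rho, xs, ys, zs):
    """Vertical gravity effect (m/s^2, positive downward) at station (xs, ys, zs)
    of a prism [x1,x2] x [y1,y2] x [z1,z2] with z positive downward."""
    gz = 0.0
    for i, x in enumerate((x1 - xs, x2 - xs)):
        for j, y in enumerate((y1 - ys, y2 - ys)):
            for k, z in enumerate((z1 - zs, z2 - zs)):
                r = np.sqrt(x * x + y * y + z * z)
                sign = (-1.0) ** (i + j + k)
                gz += sign * (x * np.log(y + r)
                              + y * np.log(x + r)
                              - z * np.arctan2(x * y, z * r))
    return G * rho * gz

# Hypothetical example: a 3 m x 3 m empty tunnel section, 100 m long,
# roof at 10 m depth, density contrast -2300 kg/m^3 (void in bedrock).
for xs in np.arange(-30.0, 31.0, 10.0):   # profile across the tube (m)
    gz = prism_gz(-1.5, 1.5, -50.0, 50.0, 10.0, 13.0, -2300.0, xs, 0.0, 0.0)
    print(f"x = {xs:6.1f} m   gz = {gz * 1e8:8.2f} microGal")
```

A real model would sum the contributions of many such prisms following the tunnel alignment, and the difference between the pre- and post-excavation surveys would be compared with this predicted response.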
Abstract:
Information about the genomic coordinates and the sequence of experimentally identified transcription factor binding sites is found scattered under a variety of diverse formats. The availability of standard collections of such high-quality data is important to design, evaluate and improve novel computational approaches to identify binding motifs in promoter sequences from related genes. ABS (http://genome.imim.es/datasets/abs2005/index.html) is a public database of known binding sites identified in promoters of orthologous vertebrate genes that have been manually curated from the literature. We have annotated 650 experimental binding sites from 68 transcription factors and 100 orthologous target genes in human, mouse, rat or chicken genome sequences. Computational predictions and promoter alignment information are also provided for each entry. A simple and easy-to-use web interface facilitates data retrieval, allowing different views of the information. In addition, release 1.0 of ABS includes a customizable generator of artificial datasets based on the known sites contained in the collection and an evaluation tool to aid in the training and assessment of motif-finding programs.
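The artificial dataset generator mentioned above can be pictured as planting curated site sequences into random background sequences while keeping track of the planted coordinates. The sketch below is a minimal, hypothetical illustration of that idea in Python; the factor names, site sequences and output format are invented and do not reflect the actual ABS generator or data.

```python
import random

# Hypothetical binding sites (in ABS these would come from the curated collection).
KNOWN_SITES = {"TBP": "TATAAA", "SP1": "GGGCGG", "NF-Y": "ATTGG"}

def make_artificial_dataset(n_sequences=10, length=500, sites_per_seq=2, seed=0):
    """Return (sequence, annotations) pairs; each annotation is
    (factor, start, end) for a site planted in a random background."""
    rng = random.Random(seed)
    dataset = []
    for _ in range(n_sequences):
        seq = [rng.choice("ACGT") for _ in range(length)]
        annotations = []
        for _ in range(sites_per_seq):
            factor, site = rng.choice(sorted(KNOWN_SITES.items()))
            start = rng.randrange(0, length - len(site))
            seq[start:start + len(site)] = list(site)   # overlaps not handled, for brevity
            annotations.append((factor, start, start + len(site)))
        dataset.append(("".join(seq), sorted(annotations, key=lambda a: a[1])))
    return dataset

for seq, notes in make_artificial_dataset(n_sequences=2):
    print(notes, seq[:50] + "...")
```

A motif-finding program can then be run on the generated sequences and scored against the planted annotations, which is essentially what an evaluation tool of this kind automates.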
Abstract:
In this paper, we present a critical analysis of the state of the art in the definition and typologies of paraphrasing. This analysis shows that there exists no characterization of paraphrasing that is at the same time comprehensive, linguistically based and computationally tractable. We then set out to define and delimit the concept on the basis of its propositional content, and we present a general, inclusive and computationally oriented typology of the linguistic mechanisms that give rise to form variations between paraphrase pairs.
Abstract:
The main goal of CleanEx is to provide access to public gene expression data via unique gene names. A second objective is to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-data set comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of human genes and genes from model organisms. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resource, such as cDNA clones or Affymetrix probe sets. The web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria. CleanEx is accessible at: http://www.cleanex.isb-sib.ch/.
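The gene index described above can be thought of as a mapping from current gene identifiers to the expression datasets and targets that refer to them. The toy sketch below illustrates the idea; every identifier in it is invented, and the real CleanEx mapping, which relies on resources such as UniGene and RefSeq, is far more involved.

```python
from collections import defaultdict

# Hypothetical target-to-gene assignments (refreshed periodically in CleanEx
# from external genome resources).
TARGET_TO_GENE = {
    "clone:IMAGE_0001": "TP53",
    "probeset:201746_at": "TP53",
    "probeset:200633_at": "UBB",
}

# Hypothetical expression datasets, each described by the targets it measured.
EXPERIMENTS = {
    "dataset_A": ["clone:IMAGE_0001", "probeset:200633_at"],
    "dataset_B": ["probeset:201746_at"],
}

def build_gene_index(experiments, target_to_gene):
    """Cross-reference each gene symbol to the datasets and targets measuring it."""
    index = defaultdict(list)
    for dataset, targets in experiments.items():
        for target in targets:
            gene = target_to_gene.get(target)
            if gene is not None:
                index[gene].append((dataset, target))
    return dict(index)

print(build_gene_index(EXPERIMENTS, TARGET_TO_GENE))
```

Queries by gene name then reduce to a lookup in this index, regardless of which technology or hybridization reagent produced the underlying measurements.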
Abstract:
The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two human issues have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low cost and without any side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
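Fractal Dimension, one of the non-linear features mentioned above, is commonly estimated from a sampled signal with Higuchi's algorithm; the sketch below shows that estimator on synthetic signals. The choice of Higuchi's method and all parameter values are assumptions made for the example, not details taken from the study.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Estimate the fractal dimension of a 1-D signal with Higuchi's method."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean_lengths = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)                 # sub-sampled time series
            if len(idx) < 2:
                continue
            total = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((len(idx) - 1) * k)    # Higuchi normalization
            lengths.append(total * norm / k)
        mean_lengths.append(np.mean(lengths))
    k_vals = np.arange(1, k_max + 1)
    # Curve length behaves as L(k) ~ k^(-D); the slope of log L vs log(1/k) is D.
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(mean_lengths), 1)
    return slope

rng = np.random.default_rng(0)
print(higuchi_fd(rng.standard_normal(2000)))                # white noise: close to 2
print(higuchi_fd(np.sin(np.linspace(0, 8 * np.pi, 2000))))  # smooth sine: close to 1
```

Features like this one would be computed per speech segment and fed, together with the linear features, to the ANN classifier.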
Abstract:
Six subject areas prompted the broad field of inquiry of this mission-oriented dust control and surface improvement project for unpaved roads:
• DUST--Hundreds of thousands of tons of dust are created annually by vehicles on Iowa's 70,000 miles of unpaved roads and streets. Such dust is often regarded as a nuisance by Iowa's highway engineers.
• REGULATIONS--Establishment of "fugitive dust" regulations by the Iowa DEQ in 1971 has created debates, conferences, legal opinions, financial responsibilities, and limited compromises regarding "reasonable precaution" and "ordinary travel," both terms being undefined judgment factors.
• THE PUBLIC--Increased awareness by the public that regulations regarding dust do in fact exist creates a discord of telephone calls, petitions, and increasing numbers of legal citations. Both engineers and politicians are frustrated into allowing either the courts or regulatory agencies to resolve what is basically a professional engineering responsibility.
• COST--Economics seldom appear as a tenet of regulatory strategies, and in the case of "fugitive dust," four-way struggles often occur between the highway professions, political bodies, regulatory agencies, and the general public as to who is responsible, what can be done, how much it will cost, or why it wasn't done yesterday.
• CONFUSION--The engineer lacks authority, and guidelines and specifications to design and construct a low-cost surfacing system are nebulous, i.e., construct something between the present crushed stone/gravel surface and a high-type pavement.
• SOLUTION--The engineer must demonstrate that dust control and surface improvement may be engineered at a reasonable cost to the public, so that a higher degree of regulatory responsibility can be vested in engineering solutions.
Abstract:
In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used; they aren't oriented towards helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what's available rather than helping determine how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools aren't very scalable: a system-level method of analysis seldom works at the project level and vice versa. In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or a county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded and in what sequence? Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective? In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After trying out a number of concepts, it appeared that a good approach would be to view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) that roads are but one sub-system of a much larger 'Road Based Transportation System', 2) that the size and activity level of the overall system are determined by market forces, 3) that the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from the market forces represents the total cost of transportation, and 4) that the economic purpose of making road improvements is to minimize that total cost. To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used at the full-system, partial-system, single-segment, project, and general design guide levels of analysis. The method appeared capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
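A minimal sketch of the "total cost" idea described above: the annual cost of the road-based system is the sum of agency, user and other components, and a candidate improvement is judged by whether it lowers that total rather than by its construction cost alone. The categories and dollar figures below are invented for illustration and are not taken from the Iowa model.

```python
# Hypothetical annual costs (thousands of dollars) for one road segment.
BASE = {
    "agency_maintenance": 45.0,   # blading, rock replacement, dust control
    "vehicle_operating": 160.0,   # fuel, tires, wear on the existing surface
    "travel_time": 70.0,
    "crashes": 25.0,
}
IMPROVED = {
    "agency_maintenance": 30.0,
    "construction_annualized": 55.0,   # improvement cost spread over its life
    "vehicle_operating": 120.0,
    "travel_time": 60.0,
    "crashes": 20.0,
}

def total_cost(components):
    """Total cost of transportation for the segment: the sum of everything expended."""
    return sum(components.values())

delta = total_cost(IMPROVED) - total_cost(BASE)
print(f"before: {total_cost(BASE):.0f}k  after: {total_cost(IMPROVED):.0f}k  change: {delta:+.0f}k")
# The improvement is justified here only because the drop in user costs
# outweighs the added construction cost (total falls from 300k to 285k).
```

The same comparison can be rolled up over every segment in a network, which is how a segment-level rule of this kind scales to project and full-system analyses.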
Abstract:
For well over 100 years, the Working Stress Design (WSD) approach has been the traditional basis for geotechnical design with regard to settlements or failure conditions. However, considerable effort has been put forth over the past couple of decades in relation to the adoption of the Load and Resistance Factor Design (LRFD) approach into geotechnical design. With the goal of producing engineered designs with consistent levels of reliability, the Federal Highway Administration (FHWA) issued a policy memorandum on June 28, 2000, requiring all new bridges initiated after October 1, 2007, to be designed according to the LRFD approach. Likewise, regionally calibrated LRFD resistance factors were permitted by the American Association of State Highway and Transportation Officials (AASHTO) to improve the economy of bridge foundation elements. Thus, projects TR-573, TR-583 and TR-584 were undertaken by a research team at Iowa State University’s Bridge Engineering Center with the goal of developing resistance factors for pile design using available pile static load test data. To accomplish this goal, the available data were first analyzed for reliability and then placed in a newly designed relational database management system termed PIle LOad Tests (PILOT), to which this first volume of the final report for project TR-573 is dedicated. PILOT is an amalgamated, electronic source of information consisting of both static and dynamic data for pile load tests conducted in the State of Iowa. The database, which includes historical data on pile load tests dating back to 1966, is intended for use in the establishment of LRFD resistance factors for design and construction control of driven pile foundations in Iowa. Although a considerable amount of geotechnical and pile load test data is available in literature as well as in various State Department of Transportation files, PILOT is one of the first regional databases to be exclusively used in the development of LRFD resistance factors for the design and construction control of driven pile foundations. Currently providing an electronically organized assimilation of geotechnical and pile load test data for 274 piles of various types (e.g., steel H-shaped, timber, pipe, Monotube, and concrete), PILOT (http://srg.cce.iastate.edu/lrfd/) is on par with such familiar national databases used in the calibration of LRFD resistance factors for pile foundations as the FHWA’s Deep Foundation Load Test Database. By narrowing geographical boundaries while maintaining a high number of pile load tests, PILOT exemplifies a model for effective regional LRFD calibration procedures.
Abstract:
In the context of recent attempts to redefine the 'skin notation' concept, a position paper summarizing an international workshop on the topic stated that the skin notation should be a hazard indicator related to the degree of toxicity and the potential for transdermal exposure of a chemical. Within the framework of developing a web-based tool integrating this concept, we constructed a database of 7101 agents for which a percutaneous permeation constant can be estimated (using molecular weight and the octanol-water partition constant), and for which at least one of the following toxicity indices could be retrieved: inhalation occupational exposure limit (n=644), oral lethal dose 50 (LD50, n=6708), cutaneous LD50 (n=1801), oral no observed adverse effect level (NOAEL, n=1600), and cutaneous NOAEL (n=187). Data sources included the Registry of Toxic Effects of Chemical Substances (RTECS, MDL Information Systems, Inc.), PHYSPROP (Syracuse Research Corp.) and safety cards from the International Programme on Chemical Safety (IPCS). A hazard index was calculated, corresponding to the product of exposure duration and exposed skin surface that would yield an internal dose equal to a toxic reference dose. This presentation provides a descriptive summary of the database, correlations between the toxicity indices, and an example of how the web tool will help industrial hygienists decide on the possibility of a dermal risk using the hazard index.
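A rough sketch of the hazard-index calculation described above, assuming the percutaneous permeation constant is estimated from molecular weight and the octanol-water partition coefficient with the Potts-Guy relation (log Kp = -2.7 + 0.71 log Kow - 0.0061 MW, Kp in cm/h) and that the skin is in contact with an aqueous solution of known concentration. The substance values and the reference dose below are illustrative only, and the actual web tool may use a different permeation model.

```python
def skin_hazard_index(mw, log_kow, c_water_mg_per_cm3, ref_dose_mg):
    """Surface x time product (cm^2.h) that would deliver the reference dose
    through the skin from an aqueous solution of the given concentration."""
    # Potts-Guy style estimate of the permeation coefficient (cm/h).
    kp = 10 ** (-2.7 + 0.71 * log_kow - 0.0061 * mw)
    flux = kp * c_water_mg_per_cm3        # dermal flux, mg per cm^2 per hour
    return ref_dose_mg / flux             # exposure duration x exposed area

# Illustrative numbers only (not from the database): MW 150 g/mol, log Kow 2.0,
# solution at 1 mg/cm^3, toxic reference dose of 10 mg.
print(f"hazard index: {skin_hazard_index(150.0, 2.0, 1.0, 10.0):.0f} cm^2.h")
```

A small hazard index flags a substance for which brief contact over a small skin area already delivers a toxic dose, i.e. a strong candidate for a skin notation.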
Abstract:
Final degree project report analyzing the SQL:1999 standard and comparing it with PostgreSQL and Oracle.
Abstract:
This paper proposes an automatic hand detection system that combines the Fourier-Mellin Transform with other computer vision techniques to achieve hand detection in cluttered-scene color images. The proposed system uses the Fourier-Mellin Transform as an invariant feature extractor to perform RST-invariant hand detection. In the first stage of the system, a simple non-adaptive skin-color-based image segmentation and a corner-based interest point detector are used to identify regions of interest that contain possible matches. A sliding-window algorithm is then used to scan the image at different scales, performing the FMT calculations only in the previously detected regions of interest and comparing the extracted FM descriptor of each window with a hand descriptor database obtained from a training image set. The results of the performed experiments suggest Fourier-Mellin invariant features as a promising approach for automatic hand detection.
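A minimal sketch of a Fourier-Mellin style RST-invariant descriptor such as the one used above: the magnitude of a 2-D FFT removes translation, a log-polar resampling turns rotation and scaling into shifts, and the magnitude of a second FFT removes those shifts. The window size, grid resolution and distance metric are arbitrary choices for the example and are not the parameters of the paper's system.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def fourier_mellin_descriptor(patch, n_rho=32, n_theta=32):
    """RST-tolerant descriptor of a grayscale window (2-D float array)."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(patch)))     # translation-invariant magnitude
    cy, cx = mag.shape[0] / 2.0, mag.shape[1] / 2.0
    r_max = min(cy, cx) - 1.0
    # Log-polar grid: rotation becomes a shift in theta, scale a shift in log-rho.
    rho = np.exp(np.linspace(0.0, np.log(r_max), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(rho, theta, indexing="ij")
    coords = np.array([cy + rr * np.sin(tt), cx + rr * np.cos(tt)])
    logpolar = map_coordinates(mag, coords, order=1, mode="nearest")
    desc = np.abs(np.fft.fft2(logpolar)).ravel()          # second FFT removes the shifts
    return desc / (np.linalg.norm(desc) + 1e-12)

def descriptor_distance(a, b):
    return float(np.linalg.norm(a - b))

# Toy check on random 64x64 windows standing in for the regions of interest
# produced by the skin-color segmentation and corner detection stages.
rng = np.random.default_rng(0)
w1, w2 = rng.random((64, 64)), rng.random((64, 64))
print(descriptor_distance(fourier_mellin_descriptor(w1), fourier_mellin_descriptor(w1)))
print(descriptor_distance(fourier_mellin_descriptor(w1), fourier_mellin_descriptor(w2)))
```

In a detection pipeline of this kind, each window descriptor would be compared against the stored hand descriptors and a match declared when the distance falls below a learned threshold.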
Abstract:
The aim of this paper is to bring into consideration a way of studying culture in infancy. An emphasis is put on the role that the material object plays in early interactive processes. Regarded as a cultural artefact, the object is seen as a fundamental element within triadic mother-object-infant interactions and is believed to be a driving force for both communicative and cognitive development. In order to reconsider the importance of the object in child development and to present an approach to studying object construction, accounts in the literature on early communication development and the importance of the object are reviewed and discussed in the light of the cultural specificity of the material object.