904 results for construction of imaginaries
Abstract:
Three-dimensional sequence stratigraphy is a potent exploration and development tool for the discovery of subtle stratigraphic traps. Reservoir morphology, heterogeneity and subtle stratigraphic trapping mechanisms can be better understood through the systematic horizontal identification of the sedimentary facies of systems tracts, provided by three-dimensional attribute maps used as an important complement to sequential analysis of two-dimensional seismic lines and well log data. On new prospects as well as on already-producing fields, the additional input of sequential analysis on three-dimensional data enables the identification, location and precise delimitation of new potentially productive zones. The first part of this paper presents four typical horizontal seismic facies assigned to the successive systems tracts of a third- or fourth-order sequence deposited in inner to outer neritic conditions on a clastic shelf. The construction of this synthetic representative sequence is based on the observed reproducibility of the horizontal seismic facies response to cyclic eustatic events in more than 35 sequences recorded in the Gulf Coast Plio-Pleistocene and Late Miocene, offshore Louisiana in the West Cameron region of the Gulf of Mexico. The second part shows how three-dimensional sequence stratigraphy can contribute to localizing and understanding sedimentary facies associated with productive zones. A case study in the early Middle Miocene Cibicides opima sands shows multiple stacked gas accumulations in the top slope fan, prograding wedge and basal transgressive systems tracts of the third-order sequence between SB 15.5 and SB 13.8 Ma.
Abstract:
The distribution limits of Crocidura russula (Hermann, 1780) and C. leucodon (Hermann, 1780) were investigated over an interval of 25 years at the bottom of the Rhone valley above Lake Geneva, Switzerland (total data set: 105 spatio-temporal occurrences, 1137 shrews). In 1975, the contact zone between the two species was situated in the region of Martigny. In 1999/2000, new sampling revealed three results: (1) the contact zone had shifted upward by about 25 km; (2) in the expanded range of C. russula, the formerly resident species had totally disappeared (confirmed by owl pellet analysis); (3) this demonstrates a dominance of C. russula over C. leucodon. Three hypotheses that may explain the range expansion of C. russula were evaluated: (1) habitat modification favouring linear dispersal due to the construction of a highway; (2) a temporal event favoured by climate fluctuations; or (3) ongoing postglacial colonisation of Europe. Hypothesis 1 was rejected because the progression of the shrews preceded the construction. Hypothesis 3 received only weak support because the range limits of C. russula in the region of Nice have been stable for thousands of years. Hypothesis 2, which holds that ongoing climate change has facilitated the range expansion, is therefore the most probable.
Abstract:
Part I of this series of articles focused on the construction of graphical probabilistic inference procedures, at various levels of detail, for assessing the evidential value of gunshot residue (GSR) particle evidence. The proposed models - in the form of Bayesian networks - address the issues of background presence of GSR particles, analytical performance (i.e., the efficiency of evidence searching and analysis procedures) and contamination. The use and practical implementation of Bayesian networks for case pre-assessment is also discussed. This paper, Part II, concentrates on Bayesian parameter estimation. This topic complements Part I in that it offers means for producing estimates usable for the numerical specification of the proposed probabilistic graphical models. Bayesian estimation procedures receive primary attention because they allow the scientist to combine his or her prior knowledge about the problem of interest with newly acquired experimental data. The present paper also considers further topics such as the sensitivity of the likelihood ratio to uncertainty in parameters and the study of likelihood ratio values obtained for members of particular populations (e.g., individuals with or without exposure to GSR).
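As a minimal sketch of the kind of Bayesian parameter estimation discussed here, the following Python fragment updates a Beta prior with survey counts for the background presence of GSR particles and derives an illustrative likelihood ratio. The counts, the detection probability and the Beta-Binomial model itself are assumptions chosen for illustration, not values or models taken from the paper.

```python
# Sketch: conjugate Beta-Binomial estimation of the probability that a
# person unconnected to a shooting carries GSR particles ("background
# presence"). All numbers are hypothetical.
from scipy import stats

# Prior knowledge: Beta(1, 1) is a uniform prior over the proportion.
alpha_prior, beta_prior = 1.0, 1.0

# Hypothetical experimental data: k of n sampled individuals with no
# firearm exposure carried at least one characteristic GSR particle.
n, k = 100, 3

# Conjugate update: posterior is Beta(alpha + k, beta + n - k).
posterior = stats.beta(alpha_prior + k, beta_prior + (n - k))

gamma_hat = posterior.mean()  # posterior mean of background probability
print(f"posterior mean: {gamma_hat:.4f}")
print(f"95% credible interval: {posterior.interval(0.95)}")

# Illustrative likelihood ratio for finding GSR on a suspect:
# P(GSR found | shooter) / P(GSR found | non-shooter).
p_detect_shooter = 0.8  # assumed detection probability for true shooters
print(f"likelihood ratio: {p_detect_shooter / gamma_hat:.1f}")
```

A sensitivity study like the one the paper mentions would repeat the last step while varying the posterior parameters and observing how the likelihood ratio responds.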
Abstract:
Background: The ultimate goal of synthetic biology is the conception and construction of genetic circuits that are reliable with respect to their designed function (e.g. oscillators, switches). This goal has yet to be attained, owing to the inherent synergy of the biological building blocks and to insufficient feedback between experiments and mathematical models. Nevertheless, progress in these directions has been substantial. Results: It has been emphasized in the literature that the architecture of a genetic oscillator must include positive (activating) and negative (inhibiting) genetic interactions in order to yield robust oscillations. Our results point out that the oscillatory capacity is affected not only by the interaction polarity but also by how it is implemented at the promoter level. For a chosen oscillator architecture, we show by means of numerical simulations that the existence or lack of competition between activator and inhibitor at the promoter level affects the probability of producing oscillations and also leaves characteristic fingerprints on the associated period/amplitude features. Conclusions: In comparison with non-competitive binding at promoters, competition drastically reduces the region of parameter space characterized by oscillatory solutions. Moreover, while competition leads to pulse-like oscillations with long-tailed distributions in period and amplitude under various parameter or noise conditions, the non-competitive scenario shows a characteristic frequency and confined amplitude values. Our study also situates the competition mechanism in the context of existing genetic oscillators, with emphasis on the Atkinson oscillator.
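To make the competitive versus non-competitive distinction concrete, here is a minimal sketch of equilibrium promoter-activity functions for an activator A and an inhibitor R. The Hill-type functional forms and all parameters are illustrative assumptions, not the model actually simulated in the paper.

```python
def activity_noncompetitive(A, R, Ka=1.0, Kr=1.0, n=2, m=2):
    """Independent binding sites: activation and repression factorize."""
    a, r = (A / Ka) ** n, (R / Kr) ** m
    return (a / (1 + a)) * (1 / (1 + r))

def activity_competitive(A, R, Ka=1.0, Kr=1.0, n=2, m=2):
    """Shared binding site: A and R occupy one partition function."""
    a, r = (A / Ka) ** n, (R / Kr) ** m
    return a / (1 + a + r)

# The denominators differ by the cross-term a*r: under non-competitive
# binding repression acts on top of activator occupancy, under competition
# it does not, which changes where in parameter space oscillations occur.
for A, R in [(0.5, 0.5), (2.0, 2.0), (4.0, 1.0)]:
    print(f"A={A}, R={R}: "
          f"non-competitive={activity_noncompetitive(A, R):.3f}, "
          f"competitive={activity_competitive(A, R):.3f}")
```

Embedding these two transcription-rate functions in the same oscillator ODEs and scanning parameters is one way to reproduce the kind of comparison the paper reports.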
Abstract:
The construction of metagenomic libraries has permitted the study of microorganisms resistant to isolation, and the analysis of 16S rDNA sequences has been used for over two decades to examine bacterial biodiversity. Here, we show that the analysis of random sequence reads (RSRs) instead of 16S rDNA is a suitable shortcut for estimating the biodiversity of a bacterial community from metagenomic libraries. We generated 10,010 RSRs from a metagenomic library of microorganisms found in human faecal samples, then searched them with the program BLASTN against a prokaryotic sequence database to assign a taxon to each RSR. The results were compared with those obtained by screening and analysing the clones containing 16S rDNA sequences in the whole library. We found that the biodiversity observed by RSR analysis is consistent with that obtained from 16S rDNA. We also show that RSRs are suitable for comparing biodiversity between different metagenomic libraries. RSRs can thus provide a good estimate of the biodiversity of a metagenomic library and, as an alternative to 16S rDNA analysis, this approach is both faster and cheaper.
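A minimal sketch of the downstream step described here: assign each read to the taxon of its best BLASTN hit and summarize the resulting diversity. It assumes BLASTN was run with tabular output (-outfmt 6); the file name and the subject-ID-to-taxon mapping are hypothetical placeholders.

```python
# Sketch: best-hit taxon assignment for random sequence reads (RSRs).
import csv
import math
from collections import Counter

def best_hits(blast_tsv):
    """Keep the highest-bitscore hit per read from -outfmt 6 output
    (column 1 = query id, 2 = subject id, 12 = bitscore)."""
    best = {}
    with open(blast_tsv) as fh:
        for row in csv.reader(fh, delimiter="\t"):
            read_id, subject_id, bitscore = row[0], row[1], float(row[11])
            if read_id not in best or bitscore > best[read_id][1]:
                best[read_id] = (subject_id, bitscore)
    return best

def shannon_index(counts):
    """Shannon diversity H' over taxon abundance counts."""
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total) for c in counts.values())

hits = best_hits("rsr_vs_prokaryotes.blastn.tsv")  # hypothetical file
# Hypothetical convention: taxon label is the first field of the subject id.
taxa = Counter(subject.split("|")[0] for subject, _ in hits.values())
print(f"{len(taxa)} taxa across {sum(taxa.values())} assigned reads")
print(f"Shannon index: {shannon_index(taxa):.3f}")
```

Running the same summary on two libraries gives the between-library comparison the abstract describes.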
Abstract:
The purpose of this investigation was to evaluate the Compensatory Wetland Mitigation Program at the Iowa Department of Transportation (DOT) in terms of regulatory compliance. Specific objectives included: 1) determining if study sites meet the definition of a jurisdictional wetland; 2) determining the degree of compliance with requirements specified in Clean Water Act (CWA) Section 404 permits. A total of 24 study sites in four age classes were randomly selected from more than 80 sites currently managed by the Iowa DOT. Wetland boundaries were delineated in the field, and mitigation compliance was determined by comparing the delineated wetland acreage at each study site to the total wetland acreage requirements specified in individual CWA Section 404 permits. Of the 24 sites evaluated in this study, 58 percent meet or exceed Section 404 permit requirements. Net gain ranged from 0.19 acre to 27.2 acres. Net loss ranged from 0.2 acre to 14.6 acres. The Denver Bypass 1 site was the worst performer, with zero acres of wetland present on the site, and the Akron Wetland Mitigation Site was the best performer, with slightly more than 27 acres over the permit requirement. Five of the 10 under-performing sites are more than five years post-construction, two are five years post-construction, one is three years post-construction, and the remaining two are one year post-construction. Of the sites that meet or exceed permit requirements, approximately 93 percent are five years or less post-construction and approximately 43 percent are only one year old. Only one of the 14 successful sites is more than five years old. Using Section 404 permit acreage requirements as the criterion for measuring success, 58 percent of the wetland mitigation sites investigated in this study are successful. Using net gain/loss as the measure of success, the Compensatory Wetland Mitigation Program has been successful in creating or restoring nearly 44 acres of wetland beyond what was required by permits.
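The compliance test applied here is simple arithmetic; the sketch below runs it over a few sites. The site names and acreages are invented placeholders, not data from the study.

```python
# Sketch: per-site Section 404 compliance from delineated vs. required acres.
sites = {
    # site name: (delineated_acres, permit_required_acres) - hypothetical
    "Site A": (12.4, 10.0),
    "Site B": (3.1, 5.0),
    "Site C": (0.0, 8.2),
}

for name, (delineated, required) in sites.items():
    net = delineated - required  # positive = net gain, negative = net loss
    status = "meets/exceeds permit" if net >= 0 else "under-performing"
    print(f"{name}: net {net:+.1f} ac -> {status}")

compliant = sum(d >= r for d, r in sites.values())
print(f"compliance rate: {compliant / len(sites):.0%}")
```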
Abstract:
For the last decade, high-resolution (HR) MS has been associated with qualitative analyses while triple-quadrupole MS has been associated with routine quantitative analyses. However, a shift of this paradigm is taking place: quantitative and qualitative analyses will increasingly be performed by HR-MS, and it will become the common 'language' for most mass spectrometrists. Most analyses will be performed by full-scan acquisitions recording 'all' ions entering the HR-MS, with subsequent construction of narrow-width extracted-ion chromatograms. Ions will be available for absolute quantification, profiling and data mining. In parallel to quantification, metabotyping will be the next step in clinical LC-MS analyses because it should help in personalized medicine. This article aims to help analytical chemists who perform targeted quantitative acquisitions with triple-quadrupole MS make the transition to quantitative and qualitative analyses using HR-MS. Guidelines for the acceptance criteria of mass accuracy and for the determination of mass extraction windows in quantitative analyses are proposed.
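A minimal sketch of the mass-extraction-window arithmetic behind narrow-width extracted-ion chromatograms: the ppm tolerance and example m/z values are illustrative, and actual acceptance criteria should follow the guidelines proposed in the article.

```python
# Sketch: mass extraction window and mass accuracy in ppm.
def extraction_window(mz, ppm_tolerance):
    """Return (low, high) m/z bounds for a given ppm tolerance."""
    delta = mz * ppm_tolerance / 1e6
    return mz - delta, mz + delta

def mass_error_ppm(measured_mz, theoretical_mz):
    """Mass accuracy of a measured ion in parts per million."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

low, high = extraction_window(mz=400.2000, ppm_tolerance=5.0)
print(f"XIC window: {low:.4f} - {high:.4f}")  # about +/-0.002 m/z at 5 ppm
print(f"mass error: {mass_error_ppm(400.2012, 400.2000):.1f} ppm")
```

The window width scales with m/z, which is why ppm-based tolerances, rather than fixed m/z widths, are the natural acceptance criterion in full-scan HR-MS quantification.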
Abstract:
Stability berms are commonly constructed where roadway embankments cross soft or unstable ground. Under certain circumstances, the construction of stability berms causes unfavorable environmental impacts, either directly or indirectly, through effects on wetlands, endangered species habitat, stream channelization, longer culvert lengths, larger right-of-way purchases, and construction access limits. In an ever more restrictive regulatory environment, these impacts are problematic. The result is the loss of valuable natural resources to the public, lengthy permitting review processes for departments of transportation and permitting agencies, and additional expenditures of time and money for all parties. The purpose of this project was to review existing stability berm alternatives for potential use in environmentally sensitive areas. The project also evaluates how stabilization technologies are made feasible, desirable, and cost-effective for transportation projects, and determines which alternatives afford practical solutions for avoiding and minimizing impacts to environmentally sensitive areas. An online survey of engineers at state departments of transportation was also conducted to assess the frequency and cost-effectiveness of the various stabilization technologies. Geotechnical engineers who responded to the survey overwhelmingly use geosynthetic reinforcement as a suitable and cost-effective solution for stabilizing embankments and cut slopes. By contrast, chemical stabilization and the installation of lime/cement columns are rarely employed by state departments of transportation as remediation measures.
Abstract:
In recent years, thin whitetopping has evolved into a viable rehabilitation technique for deteriorated asphalt cement concrete (ACC) pavements. Numerous projects have been constructed and tested; these projects allow researchers to identify the important elements contributing to the projects' successes. These elements include surface preparation, overlay thickness, synthetic fiber reinforcement usage, joint spacing, and joint sealing. Although the main factors affecting thin whitetopping performance have been identified by previous research, questions remained as to the optimum design incorporating these variables. The objective of this research is to investigate the interaction between these variables over time. Laboratory and field testing were planned to accomplish the research objective. Laboratory testing involved shear testing of the bond between the portland cement concrete (PCC) overlay and the ACC surface. Field testing involved falling weight deflectometer deflection responses, measurement of joint faulting and joint opening, and visual distress surveys on the 9.6-mile project. The project was located on Iowa Highway 13, extending north from the city of Manchester, Iowa, to Iowa Highway 3 in Delaware County. Variables investigated included ACC surface preparation, PCC thickness, synthetic fiber reinforcement usage, and joint spacing. This report documents the planning, equipment selection, construction, field changes, and construction concerns of the project built in 2002. The data from this research could be combined with historical data to develop a design specification for the construction of thin, unbonded overlays.
Abstract:
Forensic science casework involves making a series of choices. The difficulty in making these choices lies in the inevitable presence of uncertainty, the unique context of circumstances surrounding each decision and, in some cases, the complexity due to numerous, interrelated random variables. Given that these decisions can lead to serious consequences in the administration of justice, forensic decision making should be supported by a robust framework that makes inferences under uncertainty and decisions based on these inferences. The objective of this thesis is to respond to this need by presenting a framework for making rational choices in decision problems encountered by scientists in forensic science laboratories. Bayesian inference and decision theory meet the requirements for such a framework. To attain its objective, this thesis consists of three propositions, advocating the use of (1) decision theory, (2) Bayesian networks, and (3) influence diagrams for handling forensic inference and decision problems. The results present a uniform and coherent framework for making inferences and decisions in forensic science using the above theoretical concepts. They describe how to organize each type of problem by breaking it down into its different elements, and how to find the most rational course of action by distinguishing between one-stage and two-stage decision problems and applying the principle of expected utility maximization. To illustrate the framework's application to the problems encountered by scientists in forensic science laboratories, theoretical case studies apply decision theory, Bayesian networks and influence diagrams to a selection of different types of inference and decision problems dealing with different categories of trace evidence. Two studies of the two-trace problem illustrate how the construction of Bayesian networks can handle complex inference problems, and thus overcome the hurdle of complexity that can be present in decision problems. Three studies - one on what to conclude when a database search provides exactly one hit, one on what genotype to search for in a database based on the observations made on DNA typing results, and one on whether to submit a fingermark to the process of comparing it with prints of its potential sources - explain the application of decision theory and influence diagrams to each of these decisions. The results of the theoretical case studies support the thesis's three propositions. Hence, this thesis presents a uniform framework for organizing and finding the most rational course of action in decision problems encountered by scientists in forensic science laboratories. The proposed framework is an interactive and exploratory tool for better understanding a decision problem so that this understanding may lead to better informed choices.
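A minimal sketch of the one-stage decision principle the thesis builds on: choose the action that maximizes expected utility given posterior probabilities over the states. The states, probabilities and utilities below are invented placeholders for one of the example decisions (whether to submit a fingermark for comparison), not values from the thesis.

```python
# Sketch: expected utility maximization for a one-stage decision problem.
states = ["mark from suspect", "mark not from suspect"]
posterior = {"mark from suspect": 0.7, "mark not from suspect": 0.3}

# utility[action][state]: desirability of each outcome on a 0-1 scale.
utility = {
    "submit mark for comparison": {"mark from suspect": 1.0,
                                   "mark not from suspect": 0.4},
    "do not submit":              {"mark from suspect": 0.2,
                                   "mark not from suspect": 0.8},
}

def expected_utility(action):
    return sum(posterior[s] * utility[action][s] for s in states)

for action in utility:
    print(f"EU({action}) = {expected_utility(action):.2f}")
best = max(utility, key=expected_utility)
print(f"most rational course of action: {best}")
```

An influence diagram extends this picture by letting the posterior itself be computed from a Bayesian network over the case circumstances; a two-stage problem repeats the maximization after an intermediate observation.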
Abstract:
The following dissertation proposes a qualitative approach to advertising, society and the construction of identity, based on the effect of household appliance commercials in constructing the female identity of Spanish women today. Conclusions are drawn from a juxtaposition of social background and advertising content, and from how Spanish women of today perceive the evolution of female imagery depicted in advertisements. The aim is to demonstrate how much commercials mirror society and how far they reinforce paradigms that no longer exist.
Abstract:
One of the most important issues in portland cement concrete pavement research today is surface characteristics. The issue is one of balancing surface texture construction with the need for durability, skid resistance, and noise reduction. The National Concrete Pavement Technology Center at Iowa State University, in conjunction with the Federal Highway Administration, American Concrete Pavement Association, International Grinding and Grooving Association, Iowa Highway Research Board, and other states, has entered into a three-part National Surface Characteristics Program to resolve the balancing problem. As a portion of Part 2, this report documents the construction of 18 separate pavement surfaces for use in the first level of testing for the national project. It identifies the testing to be done and the limitations observed in the construction process. The results of the actual tests will be included in subsequent national study reports.
Abstract:
The objectives were to develop and evaluate an assistive technology for the use of the male condom by visually impaired men. This was a technology development study with the participation of seven subjects. Three workshops were held between April and May of 2010; they were all filmed, and the statements of the participants were transcribed and subjected to content analysis. Three categories were established: Sexuality of the visually impaired; Utilization of the text 'For avoiding STDs, condoms we will use', divided into two subcategories, Concept discussion and Text evaluation; and Construction of a simple penile prosthesis. The knowledge transmitted about STDs, the use of the condom on the penile prosthesis made by the subjects themselves, and the interaction during the workshops were effective factors in the study. In the context of sexual health, the need to develop work involving the visually impaired was noted, addressing sexually transmitted diseases and focusing on the use of the condom by this population.
Abstract:
Sample size, types of variables, the format used for measurement, and the construction of instruments to collect valid and reliable data must be considered during the research process. In the social and health sciences, and more specifically in nursing, data-collection instruments are usually composed of latent variables, i.e., variables that cannot be directly observed. This underscores the importance of deciding how to measure study variables (using an ordinal scale or a Likert or Likert-type scale). Psychometric scales are examples of instruments that are affected by the type of variables that compose them, which can cause problems with measurement and statistical analysis (parametric versus non-parametric tests). Hence, investigators using these variables must rely on assumptions grounded in simulation studies or on recommendations based on scientific evidence in order to make the best decisions.
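The measurement issue raised here has a direct statistical consequence: treating Likert-type responses as ordinal calls for non-parametric tests, while treating them as interval permits parametric ones. A minimal sketch with simulated responses (the groups and the 5-point item are invented for illustration):

```python
# Sketch: parametric vs. non-parametric comparison of Likert responses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two hypothetical groups answering a 5-point Likert item.
group_a = rng.integers(1, 6, size=40)
group_b = rng.integers(2, 6, size=40)

# Parametric choice: independent-samples t-test (interval assumption).
t_stat, t_p = stats.ttest_ind(group_a, group_b)
# Non-parametric choice: Mann-Whitney U test (ordinal assumption).
u_stat, u_p = stats.mannwhitneyu(group_a, group_b)

print(f"t-test:       p = {t_p:.4f}")
print(f"Mann-Whitney: p = {u_p:.4f}")
```

When the two tests disagree, the simulation studies mentioned above are what justify choosing one assumption over the other.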
Abstract:
Our task in this paper is to analyze the organization of trading in the era of quantitative finance. To do so, we conduct an ethnography of arbitrage, the trading strategy that best exemplifies finance in the wake of the quantitative revolution. In contrast to value and momentum investing, we argue, arbitrage involves an art of association: the construction of equivalence (comparability) of properties across different assets. In place of essential or relational characteristics, the peculiar valuation that takes place in arbitrage is based on an operation that makes something the measure of something else, associating securities with one another. The process of recognizing opportunities and the practices of making novel associations are shaped by the specific socio-spatial and socio-technical configurations of the trading room. Calculation is distributed across persons and instruments as the trading room organizes interaction among diverse principles of valuation.