Abstract:
Synthetic biology seeks to enable programmed control of cellular behavior through engineered biological systems. These systems typically consist of synthetic circuits that function inside, and interact with, complex host cells possessing pre-existing metabolic and regulatory networks. Nevertheless, when designing such systems, a simple, well-defined interface between the synthetic gene circuit and the host is frequently assumed. We describe the generation of robust but unexpected oscillations in the densities of Escherichia coli populations by simple synthetic suicide circuits containing quorum components and a lysis gene. Contrary to design expectations, oscillations required neither the quorum-sensing genes (luxR and luxI) nor known regulatory elements in the P(luxI) promoter. Instead, oscillations were likely due to density-dependent plasmid amplification that established a population-level negative feedback. A mathematical model based on this mechanism captures the key characteristics of the oscillations, and model predictions regarding perturbations to plasmid amplification were experimentally validated. Our results underscore the importance of plasmid copy number and the potential impact of "hidden interactions" on the behavior of engineered gene circuits - a major challenge for standardizing biological parts. As synthetic biology grows as a discipline, increasing value may be derived from tools that enable the assessment of parts in their final context.
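The proposed negative feedback (density drives plasmid copy number up; plasmid-borne lysis gene dosage drives density back down) can be illustrated with a toy two-variable ODE integrated by forward Euler. This is a minimal sketch under assumed parameter names and values, not the authors' fitted model.

```python
# Toy model of the mechanism described above: N is cell density, p is
# per-cell plasmid copy number. Density raises copy number (a * N),
# and copy number raises the lysis rate (k * p * N), closing a
# population-level negative feedback. All parameters are illustrative
# assumptions, not the paper's values.
def simulate(r=0.5, K=1.0, k=2.0, a=1.0, d=0.1, dt=0.01, steps=20000):
    N, p = 0.01, 0.0          # initial density and copy number
    traj = []
    for _ in range(steps):
        dN = r * N * (1.0 - N / K) - k * p * N   # logistic growth minus dose-dependent lysis
        dp = a * N - d * p                       # amplification with density, slow decay
        N = max(N + dN * dt, 0.0)
        p = max(p + dp * dt, 0.0)
        traj.append(N)
    return traj

traj = simulate()
```

Because copy number lags density, the population overshoots, crashes when the lysis dose catches up, and recovers once plasmid levels decay, giving relaxation-style dynamics in this toy setting.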
Abstract:
BACKGROUND: Phenotypic differences among species have long been systematically itemized and described by biologists in the process of investigating phylogenetic relationships and trait evolution. Traditionally, these descriptions have been expressed in natural language within the context of individual journal publications or monographs. As such, this rich store of phenotype data has been largely unavailable for statistical and computational comparisons across studies or integration with other biological knowledge. METHODOLOGY/PRINCIPAL FINDINGS: Here we describe Phenex, a platform-independent desktop application designed to facilitate efficient and consistent annotation of phenotypic similarities and differences using Entity-Quality syntax, drawing on terms from community ontologies for anatomical entities, phenotypic qualities, and taxonomic names. Phenex can be configured to load only those ontologies pertinent to a taxonomic group of interest. The graphical user interface was optimized for evolutionary biologists accustomed to working with lists of taxa, characters, character states, and character-by-taxon matrices. CONCLUSIONS/SIGNIFICANCE: Annotation of phenotypic data using ontologies and globally unique taxonomic identifiers will allow biologists to integrate phenotypic data from different organisms and studies, leveraging decades of work in systematics and comparative morphology.
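An Entity-Quality (EQ) annotation of the kind Phenex records can be sketched as a small data structure pairing a taxon identifier with an anatomical entity and a phenotypic quality. The class and the term labels below are illustrative assumptions, not Phenex's internal model (real annotations would use ontology term IDs such as UBERON and PATO identifiers).

```python
from dataclasses import dataclass

# Minimal sketch of an Entity-Quality (EQ) phenotype annotation.
# Human-readable term labels are shown in place of ontology IDs for
# readability; this is not Phenex's actual data model.
@dataclass(frozen=True)
class EQAnnotation:
    taxon: str      # globally unique taxonomic identifier (label here)
    entity: str     # anatomical entity term (e.g. from an anatomy ontology)
    quality: str    # phenotypic quality term (e.g. from a quality ontology)

ann = EQAnnotation(
    taxon="Danio rerio",
    entity="dorsal fin",
    quality="decreased size",
)
```

Keeping entity and quality as separate ontology-backed fields, rather than free text, is what makes annotations comparable across studies and taxa.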
Abstract:
Telecentric optical computed tomography (optical-CT) is a state-of-the-art method for visualizing and quantifying 3-dimensional dose distributions in radiochromic dosimeters. In this work a prototype telecentric system (DFOS-Duke Fresnel Optical-CT Scanner) is evaluated which incorporates two substantial design changes: the use of Fresnel lenses (reducing lens costs from $10-30K to $1-3K) and the use of a 'solid tank' (which reduces noise, and the volume of refractively matched fluid from 1 L to 10 cc). The efficacy of DFOS was evaluated by direct comparison against commissioned scanners in our lab. Measured dose distributions from all systems were compared against the predicted dose distributions from a commissioned treatment planning system (TPS). Three treatment plans were investigated including a simple four-field box treatment, a multiple small field delivery, and a complex IMRT treatment. Dosimeters were imaged within 2 h post irradiation, using consistent scanning techniques (360 projections acquired at 1 degree intervals, reconstruction at 2 mm). DFOS efficacy was evaluated through inspection of dose line-profiles, and 2D and 3D dose and gamma maps. DFOS/TPS gamma pass rates with 3%/3 mm dose difference/distance-to-agreement criteria ranged from 89.3% to 92.2%, compared with 95.6% to 99.0% obtained with the commissioned system. The 3D gamma pass rate between the commissioned system and DFOS was 98.2%. The typical noise rates in DFOS reconstructions were up to 3%, compared to under 2% for the commissioned system. In conclusion, while the introduction of a solid tank proved advantageous with regards to cost and convenience, further work is required to improve the image quality and dose reconstruction accuracy of the new DFOS optical-CT system.
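The gamma pass rate quoted above combines a dose-difference tolerance (3% of the maximum planned dose) with a distance-to-agreement tolerance (3 mm). A brute-force 1-D sketch of that computation is shown below; the profiles, grid spacing, and exhaustive search are illustrative assumptions, not the commissioned analysis pipeline.

```python
import math

# 1-D gamma index: for each measured point, find the planned point that
# minimizes the combined (normalized) dose and distance discrepancy; a
# point passes if that minimum gamma is <= 1.
def gamma_pass_rate(measured, planned, spacing_mm=1.0,
                    dose_tol=0.03, dist_tol_mm=3.0):
    ref_max = max(planned)            # global normalization dose
    passed = 0
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dp in enumerate(planned):
            dose_term = ((dm - dp) / (dose_tol * ref_max)) ** 2
            dist_term = (((i - j) * spacing_mm) / dist_tol_mm) ** 2
            best = min(best, math.sqrt(dose_term + dist_term))
        passed += best <= 1.0
    return passed / len(measured)

# Toy profiles (arbitrary dose units), not measured data.
planned  = [0, 10, 50, 100, 100, 50, 10, 0]
measured = [0, 11, 52,  99, 101, 49, 10, 0]
rate = gamma_pass_rate(measured, planned)
```

With these small deviations every point passes, so `rate` is 1.0; real 3-D analyses run the same test over a volumetric grid.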
Abstract:
Motivated by the Minimal Dark Matter scenario, we consider the annihilation into gamma rays of candidates in the fermionic 5-plet and scalar 7-plet representations of SU(2)L, taking into account both the Sommerfeld effect and the internal bremsstrahlung. Assuming the Einasto profile, we show that present measurements of the Galactic Center by the H.E.S.S. instrument exclude the 5-plet and 7-plet as the dominant form of dark matter for masses between 1 TeV and 20 TeV, in particular, the 5-plet mass leading to the observed dark matter density via thermal freeze-out. We also discuss prospects for the upcoming Cherenkov Telescope Array, which will be able to probe even heavier dark matter masses, including the scenario where the scalar 7-plet is thermally produced.
Abstract:
A natural approach to representing and reasoning about temporal propositions (i.e., statements with time-dependent truth-values) is to associate them with time elements. In the literature, there are three choices regarding the primitive for the ontology of time: (1) instantaneous points, (2) durative intervals and (3) both points and intervals. Problems may arise when one conflates different views of temporal structure, and questions remain as to whether certain types of temporal propositions can be validly and meaningfully associated with different time elements. In this paper, we shall summarize an ontological glossary with respect to time elements, and diversify a wider range of meta-predicates for ascribing temporal propositions to time elements. Based on these, we shall also devise a versatile categorization of temporal propositions, which can subsume those representative categories proposed in the literature, including that of Vendler, of McDermott, of Allen, of Shoham, of Galton and of Terenziani and Torasso. It is demonstrated that the new categorization of propositions, together with the proposed range of meta-predicates, provides the expressive power for modeling some typical temporal terms/phenomena, such as starting-instant, stopping-instant, dividing-instant, instigation, termination and intermingling etc.
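The third ontological choice (both points and intervals as primitives) can be sketched as a two-sorted representation with a simple containment meta-predicate. The class names and the `within` predicate are illustrative assumptions in the spirit of the paper, not its formal definitions.

```python
from dataclasses import dataclass
from typing import Union

# Two-sorted time ontology: instantaneous points and durative intervals.
@dataclass(frozen=True)
class Point:
    t: float

@dataclass(frozen=True)
class Interval:
    start: float
    end: float   # durative: start < end is assumed

TimeElement = Union[Point, Interval]

def duration(e: TimeElement) -> float:
    """Points have zero duration; intervals have positive extent."""
    return 0.0 if isinstance(e, Point) else e.end - e.start

def within(inner: TimeElement, outer: Interval) -> bool:
    """One illustrative meta-predicate: inner lies inside outer."""
    if isinstance(inner, Point):
        return outer.start <= inner.t <= outer.end
    return outer.start <= inner.start and inner.end <= outer.end
```

Distinguishing the two sorts at the type level is what lets different meta-predicates (e.g. holding at a point versus holding throughout an interval) be ascribed to the appropriate kind of time element.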
Abstract:
Time-series and sequences are important patterns in data mining. Based on an ontology of time-elements, this paper presents a formal characterization of time-series and state-sequences, where a state denotes a collection of data whose validity depends on time. While a time-series is formalized as a vector of time-elements temporally ordered one after another, a state-sequence is denoted as a list of states correspondingly ordered by a time-series. In general, a time-series and a state-sequence can be incomplete in various ways. This leads to the distinction between complete and incomplete time-series, and between complete and incomplete state-sequences, which allows the expression of both absolute and relative temporal knowledge in data mining.
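The complete/incomplete distinction can be sketched concretely: represent a time-series as an ordered list of intervals and call it complete when consecutive intervals meet with no gap. The `(start, end)` tuple encoding and the `is_complete` criterion are illustrative assumptions, not the paper's formalism.

```python
# A time-series as an ordered list of (start, end) intervals, and a
# state-sequence as a parallel list of states. "Complete" here means
# each interval ends exactly where the next begins (no temporal gaps).
def is_complete(series):
    return all(a[1] == b[0] for a, b in zip(series, series[1:]))

series = [(0, 2), (2, 5), (5, 6)]          # gap-free
states = ["low", "high", "low"]            # state valid over each interval
assert len(series) == len(states)

complete = is_complete(series)
gappy = is_complete([(0, 2), (3, 5)])      # missing coverage of (2, 3)
```

An incomplete series like the second one still carries relative knowledge (one interval precedes the other) even though absolute coverage is missing.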
Abstract:
Seabirds are effective samplers of the marine environment, and can be used to measure resource partitioning among species and sites via food loads destined for chicks. We examined the composition, overlap, and relationships to changing climate and oceanography of 3,216 food loads from Least, Crested, and Whiskered Auklets (Aethia pusilla, A. cristatella, A. pygmaea) breeding in Alaska during 1994–2006. Meals comprised calanoid copepods (Neocalanus spp.) and euphausiids (Thysanoessa spp.) that reflect secondary marine productivity, with no difference among Buldir, Kiska, and Kasatochi islands across 585 km of the Aleutian Islands. Meals were very similar among species (mean Least–Crested Auklet overlap C = 0.68; Least–Whiskered Auklet overlap C = 0.96) and among sites, indicating limited partitioning of prey resources for auklets feeding chicks. The biomass of copepods and euphausiids in Least and Crested Auklet food loads was related negatively to the summer (June–July–August) North Pacific Gyre Oscillation, while in Whiskered Auklet food loads, this was negatively related to the winter (December–January–February) Pacific Decadal Oscillation, both of which track basin-wide sea-surface temperature (SST) anomalies. We found a significant quadratic relationship between the biomass of calanoid copepods in Least Auklet food loads at all three study sites and summer (June–July) SST, with maximal copepod biomass between 3 and 6°C (r² = 0.71). Outside this temperature range, zooplankton becomes less available to auklets through delayed development. Overall, our results suggest that auklets are able to buffer climate-mediated bottom-up forcing of demographic parameters like productivity, as the composition of chick meals has remained constant over the course of our study.
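Diet-overlap indices like the C values above are typically computed from prey-category proportions in the two diets, in the style of Schoener's index, C = 1 − 0.5 · Σ|p_i − q_i|. The sketch below uses made-up prey proportions, not the study's data.

```python
# Diet overlap in the style of Schoener's index: C = 1 when the two
# diets have identical prey proportions, C = 0 when they share nothing.
def overlap_C(p, q):
    assert abs(sum(p) - 1.0) < 1e-9 and abs(sum(q) - 1.0) < 1e-9
    return 1.0 - 0.5 * sum(abs(a - b) for a, b in zip(p, q))

# Hypothetical prey proportions: copepods, euphausiids, other.
least     = [0.70, 0.25, 0.05]
whiskered = [0.66, 0.28, 0.06]
C = overlap_C(least, whiskered)
```

With these toy proportions C comes out at 0.96, illustrating how nearly identical diets yield an overlap close to 1.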
Abstract:
Overfishing is arguably the greatest ecological threat facing the oceans, yet catches of many highly migratory fishes including oceanic sharks remain largely unregulated with poor monitoring and data reporting. Oceanic shark conservation is hampered by basic knowledge gaps about where sharks aggregate across population ranges and precisely where they overlap with fishers. Using satellite tracking data from six shark species across the North Atlantic, we show that pelagic sharks occupy predictable habitat ‘hotspots’ of high space use. Movement modelling showed sharks preferred habitats characterised by strong sea-surface-temperature gradients (fronts) over other available habitats. However, simultaneous Global Positioning System (GPS) tracking of the entire Spanish and Portuguese longline-vessel fishing fleets shows an 80% overlap of fished areas with hotspots, potentially increasing shark susceptibility to fishing exploitation. Regions of high overlap between tagged oceanic sharks and longliners included the North Atlantic Current/Labrador Current convergence zone and the Mid-Atlantic Ridge south-west of the Azores. In these main regions, and sub-areas within them, shark/vessel co-occurrence was spatially and temporally persistent between years, highlighting how efficiently fishing exploitation ‘tracks’ oceanic sharks within their space-use hotspots year-round. Given this intense focus of longliners on shark hotspots, our study argues for the need for international catch limits for pelagic sharks and identifies a future role for combining fine-scale fish and vessel telemetry to inform the ocean-scale management of fisheries.
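An overlap statistic like the 80% figure can be sketched on a spatial grid as the fraction of fished cells that fall inside shark space-use hotspots. The grid cells below are toy assumptions, not the study's telemetry data.

```python
# Fraction of fished grid cells that intersect shark hotspot cells.
# Cells are (row, col) indices of a hypothetical spatial grid.
def fished_hotspot_overlap(fished_cells, hotspot_cells):
    fished = set(fished_cells)
    return len(fished & set(hotspot_cells)) / len(fished)

hotspots = {(0, 0), (0, 1), (1, 1), (2, 3)}            # high shark space use
fished   = {(0, 0), (0, 1), (1, 1), (1, 2), (2, 3)}    # longline effort
frac = fished_hotspot_overlap(fished, hotspots)
```

Here 4 of 5 fished cells lie in hotspots, so `frac` is 0.8; the study's analysis is the ocean-scale analogue built from tracked sharks and GPS-tracked vessels.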
Abstract:
In this paper, I argue that there is an inconsistency between two presentist doctrines: that of ontological symmetry and asymmetry of fixity. The former refers to the presentist belief that the past and future are equally unreal. The latter refers to the A-Theoretic intuition that the past is closed or actual, and the future is open or potential. My position in this paper is that the presentist is unable to account for the temporal asymmetry that is so fundamentally a part of her theory. In Section I, I briefly outline a recent defence of presentism due to Craig, and argue that a flaw in this defence highlights the tension between the presentist's doctrines of ontological symmetry and asymmetry of fixity. In Section II, I undertake an investigation, on the presentist's behalf, in order to determine whether she is capable of reconciling these two doctrines. In the course of the investigation, I consider different asymmetries, other than that of ontology, which might be said fundamentally to constitute temporal asymmetry, and the asymmetry of fixity in particular. In Section III, I also consider whether the presentist is able to avail herself of some of the standard B-Theoretic accounts of the asymmetry of fixity, and argue that she cannot. Finally, I conclude that temporal asymmetry cannot be accounted for (or explained) other than through the postulation of an ontological asymmetry.
Abstract:
Hunter and Konieczny explored the relationships between measures of inconsistency for a belief base and the minimal inconsistent subsets of that belief base in several of their papers. In particular, an inconsistency value termed MIVC, defined from minimal inconsistent subsets, can be considered as a Shapley Inconsistency Value. Moreover, it can be axiomatized completely in terms of five simple axioms. MinInc, one of the five axioms, states that each minimal inconsistent set has the same amount of conflict. However, it conflicts with the intuition illustrated by the lottery paradox, which states that as the size of a minimal inconsistent belief base increases, the degree of inconsistency of that belief base becomes smaller. To address this, we present two kinds of revised inconsistency measures for a belief base from its minimal inconsistent subsets. Each of these measures considers the size of each minimal inconsistent subset as well as the number of minimal inconsistent subsets of a belief base. More specifically, we first present a vectorial measure to capture the inconsistency for a belief base, which is more discriminative than MIVC. Then we present a family of weighted inconsistency measures based on the vectorial inconsistency measure, which allow us to capture the inconsistency for a belief base in terms of a single numerical value as usual. We also show that each of the two kinds of revised inconsistency measures can be considered as a particular Shapley Inconsistency Value, and can be axiomatically characterized by the corresponding revised axioms presented in this paper.
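The MIVC-style value can be sketched directly from minimal inconsistent subsets (MISes): each formula receives a contribution of 1/|M| from every MIS M containing it. The toy belief base and its MISes below are supplied by hand as assumptions; computing MISes from real formulas would require an entailment/satisfiability check that this sketch omits.

```python
# MIVC-style per-formula inconsistency value: sum 1/|M| over the
# minimal inconsistent subsets M that contain the formula. Formulas
# are plain strings here; the MISes are assumed, not derived.
def mivc(formulas, mises):
    return {f: sum(1.0 / len(M) for M in mises if f in M) for f in formulas}

formulas = ["a", "not a", "b", "not b or not a"]
mises = [
    {"a", "not a"},                    # size 2: each member gets 1/2
    {"a", "b", "not b or not a"},      # size 3: each member gets 1/3
]
scores = mivc(formulas, mises)
```

Formula "a" sits in both MISes and scores 1/2 + 1/3, the highest value; the size-sensitive weighting also illustrates the lottery-paradox intuition that larger minimal inconsistent sets contribute less blame per formula.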