34 results for Unified Transform Kernel
Abstract:
Members of the plant NITRATE TRANSPORTER 1/PEPTIDE TRANSPORTER (NRT1/PTR) family display protein sequence homology with the SLC15/PepT/PTR/POT family of peptide transporters in animals. In contrast to their animal and bacterial counterparts, these plant proteins transport a wide variety of substrates: nitrate, peptides, amino acids, dicarboxylates, glucosinolates, IAA, and ABA. The phylogenetic relationships among members of the NRT1/PTR family in 31 fully sequenced plant genomes allowed the identification of unambiguous clades, defining eight subfamilies. The phylogenetic tree was used to derive a unified nomenclature for this family, named NPF for NRT1/PTR FAMILY. We propose that the members be named accordingly: NPFX.Y, where X denotes the subfamily and Y the individual member within the species.
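As a minimal illustration of the proposed NPFX.Y scheme, the sketch below splits an identifier into its subfamily (X) and member (Y) parts. The optional species prefix and the example names are hypothetical and not part of the nomenclature defined in the abstract.

```python
import re

# Sketch: decompose a proposed NPFX.Y identifier into subfamily (X) and
# member (Y). The species prefix (e.g. "At") is a hypothetical extension.
NPF_PATTERN = re.compile(r"^(?P<prefix>[A-Za-z]*?)NPF(?P<subfamily>\d+)\.(?P<member>\d+)$")

def parse_npf(name: str) -> dict:
    match = NPF_PATTERN.match(name)
    if match is None:
        raise ValueError(f"not a valid NPF identifier: {name}")
    return {
        "species_prefix": match.group("prefix") or None,
        "subfamily": int(match.group("subfamily")),
        "member": int(match.group("member")),
    }

print(parse_npf("NPF6.3"))     # {'species_prefix': None, 'subfamily': 6, 'member': 3}
print(parse_npf("AtNPF2.10"))  # {'species_prefix': 'At', 'subfamily': 2, 'member': 10}
```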
Abstract:
Fourier transform infrared spectroscopy (FTIRS) can provide detailed information on the organic and minerogenic constituents of sediment records. Based on a large number of sediment samples of varying age (0–340 000 yrs) and from very diverse lake settings in Antarctica, Argentina, Canada, Macedonia/Albania, Siberia, and Sweden, we have developed universally applicable calibration models for the quantitative determination of biogenic silica (BSi; n = 816), total inorganic carbon (TIC; n = 879), and total organic carbon (TOC; n = 3164) using FTIRS. These models are based on the differential absorbance of infrared radiation at specific wavelengths as the concentrations of the individual parameters vary, owing to the molecular vibrations associated with each parameter. The calibration models have low prediction errors, and the predicted values are highly correlated with conventionally measured values (R = 0.94–0.99). Robustness tests indicate that the accuracy of the newly developed FTIRS calibration models is similar to that of conventional geochemical analyses. Consequently, FTIRS offers a useful and rapid alternative to conventional analyses for the quantitative determination of BSi, TIC, and TOC. The rapidity, cost-effectiveness, and small sample size required enable FTIRS determination of geochemical properties to be undertaken at higher resolution than would otherwise be possible with the same resource allocation, thus providing crucial sedimentological information for climatic and environmental reconstructions.
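The abstract does not name the regression technique behind the calibration models; spectroscopic calibrations of this kind are commonly built with partial least squares (PLS) regression. The sketch below is a minimal example under that assumption, with random placeholder arrays standing in for measured spectra and conventionally determined TOC values.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Placeholder data: rows are sediment samples, columns are absorbance values
# at individual FTIR wavenumbers; y holds conventionally measured TOC (%).
rng = np.random.default_rng(0)
X = rng.random((200, 1800))   # hypothetical spectra (200 samples x 1800 wavenumbers)
y = rng.random(200) * 20      # hypothetical TOC concentrations

# Calibration model: PLS regression compresses the spectra into a few latent
# components that co-vary with the target concentration.
model = PLSRegression(n_components=10)
model.fit(X, y)

# Cross-validated predictions give error statistics analogous to the R values
# reported for the FTIRS calibration models.
y_cv = cross_val_predict(model, X, y, cv=10).ravel()
rmse = float(np.sqrt(np.mean((y - y_cv) ** 2)))
r = float(np.corrcoef(y, y_cv)[0, 1])
print(f"cross-validated R = {r:.2f}, RMSE = {rmse:.2f}")
```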
Abstract:
We demonstrate the use of Fourier transform infrared spectroscopy (FTIRS) to make quantitative measurements of total organic carbon (TOC), total inorganic carbon (TIC), and biogenic silica (BSi) concentrations in sediment. FTIRS is a fast and cost-effective technique, and only small sediment samples are needed (0.01 g). Statistically significant models were developed using sediment samples from northern Sweden and were applied to sediment records from Sweden, northeast Siberia, and Macedonia. The correlation between FTIRS-inferred values and conventionally assessed amounts of the biogeochemical constituents varied between r = 0.84–0.99 for TOC, r = 0.85–0.99 for TIC, and r = 0.68–0.94 for BSi. Because FTIR spectra contain information on a large number of both inorganic and organic components, there is great potential for FTIRS to become an important tool in paleolimnology.
Abstract:
Software architecture consists of a set of design choices that can be partially expressed as rules that the implementation must conform to. Architectural rules are intended to ensure properties that fulfill fundamental non-functional requirements. Verifying architectural rules is often a non-trivial activity: available tools are often not very usable and support only a narrow subset of the rules commonly specified by practitioners. In this paper we present a new, highly readable declarative language for specifying architectural rules. With our approach, users can specify a wide variety of rules using a single uniform notation. Rules can be tested by third-party tools by conforming to pre-defined specification templates. Practitioners can take advantage of the capabilities of a growing number of testing tools without dealing with them directly.
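The paper's own rule notation is not reproduced here. As a stand-in, the following toy Python sketch checks one example of the kind of rule such a language expresses (a layer-dependency rule: modules in a "ui" layer must not import from a "persistence" layer) directly against source files. All layer and directory names are hypothetical placeholders.

```python
import ast
from pathlib import Path

# Toy conformance check for one architectural rule:
# "modules in the 'ui' layer must not import from the 'persistence' layer".
FORBIDDEN = {"ui": {"persistence"}}

def imported_top_level_modules(path: Path) -> set[str]:
    # Collect the top-level module names imported by one source file.
    tree = ast.parse(path.read_text(encoding="utf-8"))
    names: set[str] = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module.split(".")[0])
    return names

def check_layers(root: Path) -> list[str]:
    violations = []
    for layer, banned in FORBIDDEN.items():
        for source in (root / layer).rglob("*.py"):
            hits = imported_top_level_modules(source) & banned
            violations += [f"{source}: imports {m}" for m in sorted(hits)]
    return violations

if __name__ == "__main__":
    for violation in check_layers(Path("src")):  # "src" is a placeholder project root
        print("violation:", violation)
```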
Abstract:
Methods for tracking an object have generally fallen into two groups: tracking by detection and tracking through local optimization. The advantage of detection-based tracking is its ability to deal with target appearance and disappearance, but it does not naturally exploit target motion continuity during detection. The advantage of local optimization is efficiency and accuracy, but it requires additional algorithms to initialize tracking when the target is lost. To bridge these two approaches, we propose a framework that unifies detection and tracking as a time-series Bayesian estimation problem. The basis of our approach is to treat both detection and tracking as a sequential entropy minimization problem, where the goal is to determine the parameters describing a target in each frame. To do this, we integrate the Active Testing (AT) paradigm with Bayesian filtering, resulting in a framework capable of both detecting and tracking robustly in situations where the target object regularly enters and leaves the field of view. We demonstrate our approach on a retinal tool tracking problem and show through extensive experiments that our method provides an efficient and robust tracking solution.
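For readers unfamiliar with the time-series Bayesian estimation setting, the sketch below shows a generic bootstrap particle filter for a one-dimensional target position. It illustrates only the recursive predict/update/resample structure; the Active Testing integration and entropy-minimization formulation described in the abstract are not shown, and all data are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter(observations, n_particles=500, motion_std=1.0, obs_std=2.0):
    # Prior over the scalar target position.
    particles = rng.uniform(-50, 50, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z in observations:
        # Predict: propagate particles through a random-walk motion model.
        particles = particles + rng.normal(0.0, motion_std, n_particles)
        # Update: reweight by the likelihood of the new observation.
        weights *= np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.sum(weights * particles)))
        # Resample to avoid weight degeneracy.
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
        weights = np.full(n_particles, 1.0 / n_particles)
    return estimates

# Hypothetical noisy observations of a target drifting from 0 to about 10.
true_path = np.linspace(0, 10, 30)
observations = true_path + rng.normal(0, 2.0, true_path.size)
print(particle_filter(observations)[-1])  # posterior mean position at the last frame
```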
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic, and strong forces. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, including the unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles with extended objects such as closed and open strings. In this short note, I give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.
Abstract:
Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem, we analyze the current state of practice and propose an approach based on an extensible, declarative, and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from the technical specification prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance checking results by assisting the user towards full alignment of the implementation with its architecture. In particular, we show the benefits of providing actionable results by introducing a technique that automatically selects the optimal repairing solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support truly diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for explicit management of the involved validation tools. The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
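The following is a minimal, hypothetical sketch of a profit-based selection step of the kind mentioned above: each candidate repair is scored as simulated value gained minus estimated cost, and the highest-scoring repair is chosen. The repair descriptions and numbers are placeholders, not the paper's actual quantification or simulation procedure.

```python
from dataclasses import dataclass

@dataclass
class Repair:
    """One hypothetical candidate repair for an architectural violation."""
    description: str
    estimated_value: float   # benefit of restoring conformance (would come from simulation)
    estimated_cost: float    # effort required to apply the change

    def profit(self) -> float:
        return self.estimated_value - self.estimated_cost

candidates = [
    Repair("move class to the allowed layer", estimated_value=8.0, estimated_cost=3.0),
    Repair("introduce interface and invert dependency", estimated_value=9.5, estimated_cost=6.0),
    Repair("suppress the rule for this module", estimated_value=1.0, estimated_cost=0.5),
]

# Select the repair with the highest profit.
best = max(candidates, key=Repair.profit)
print(f"selected repair: {best.description} (profit = {best.profit():.1f})")
```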
Abstract:
kdens produces univariate kernel density estimates and graphs the result. kdens supplements Stata's official kdensity command. Important additions are: adaptive (i.e. variable-bandwidth) kernel density estimation, several automatic bandwidth selectors including the Sheather-Jones plug-in estimator, pointwise variability bands and confidence intervals, boundary correction for variables with bounded domain, and fast estimation via binned approximation. Note that the moremata package, also available from SSC, is required.
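kdens itself is a Stata command; for illustration only, the sketch below shows the plain fixed-bandwidth Gaussian kernel density estimate that such a command computes on a grid. The adaptive bandwidths, Sheather-Jones selection, variability bands, boundary correction, and binned approximation listed above are not reproduced here.

```python
import numpy as np

def gaussian_kde(sample, grid, bandwidth):
    # Fixed-bandwidth Gaussian kernel density estimate evaluated on a grid.
    z = (grid[:, None] - sample[None, :]) / bandwidth
    kernel = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return kernel.mean(axis=1) / bandwidth

rng = np.random.default_rng(2)
x = rng.normal(0, 1, 500)                  # hypothetical data
grid = np.linspace(-4, 4, 101)
h = 1.06 * x.std() * x.size ** (-1 / 5)    # normal-reference rule-of-thumb bandwidth
density = gaussian_kde(x, grid, h)
print(f"estimated density at 0: {density[np.argmin(np.abs(grid))]:.3f}")
```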
Abstract:
devcon transforms the coefficients of 0/1 dummy variables so that they reflect deviations from the "grand mean" rather than deviations from the reference category (the transformed coefficients are equivalent to those obtained by so-called "effects coding") and adds the coefficient for the reference category. The variance-covariance matrix of the estimates is transformed accordingly. The transformed estimates can be used with post-estimation procedures. In particular, devcon can be used to solve the identification problem for dummy variable effects in the so-called Blinder-Oaxaca decomposition (see the oaxaca package).
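A minimal numpy sketch of the unweighted version of this transformation: dummy coefficients (with the reference category fixed at zero) are re-expressed as deviations from the mean of all category effects, and a coefficient for the reference category is added. devcon's handling of category weights and of the variance-covariance matrix is not shown, and the example coefficients are placeholders.

```python
import numpy as np

def effects_coding(dummy_coefs):
    """Unweighted sketch: dummy_coefs are the coefficients for categories 2..k
    (reference category coefficient = 0). Returns coefficients expressed as
    deviations from the grand mean, with the reference-category entry first."""
    full = np.concatenate(([0.0], np.asarray(dummy_coefs, dtype=float)))
    grand_mean = full.mean()
    return full - grand_mean  # entries sum to zero

# Hypothetical regression with 4 categories, reference category omitted:
b = [1.2, -0.4, 2.0]
print(effects_coding(b))  # first value is the added reference-category coefficient
```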