633 results for logarithmic sprayer


Relevance: 10.00%

Abstract:

This study utilised recent developments in forensic aromatic hydrocarbon fingerprint analysis to characterise and identify specific biogenic, pyrogenic and petrogenic contamination. The fingerprinting and data interpretation techniques discussed include the recognition of:

• the distribution patterns of hydrocarbons (alkylated naphthalene, phenanthrene, dibenzothiophene, fluorene, chrysene and phenol isomers),
• "source-specific marker" compounds (individual saturated hydrocarbons, including n-alkanes from n-C5 through n-C40),
• selected benzene, toluene, ethylbenzene and xylene (BTEX) isomers,
• the recalcitrant isoprenoids pristane and phytane, and
• diagnostic ratios of specific petroleum/non-petroleum constituents, together with the application of various statistical and numerical analysis tools.

An unknown sample submitted by the Irish Environmental Protection Agency (EPA) for origin characterisation was analysed by gas chromatography with both flame ionisation and mass spectral detection, in comparison with known reference materials. The percentages of the individual polycyclic aromatic hydrocarbons (PAHs) and the biomarker concentrations in the unknown sample were normalised to the sum of the analytes, and the results were compared with the corresponding results for a range of reference materials. In addition to the determination of conventional diagnostic PAH and biomarker ratios, a number of "source-specific" isomeric PAHs within the same alkylation levels were determined, and their relative abundance ratios were computed in order to definitively identify and differentiate the various sources. Statistical logarithmic star plots were generated from both sets of data to give a pictorial representation of the comparison between the unknown sample and the reference products. The study successfully characterised the unknown sample as contaminated with a coal tar, and clearly demonstrates the future role of compound ratio analysis (CORAT) in the identification of possible source contaminants.
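The normalisation step lends itself to a short illustration. The sketch below, with hypothetical analyte names and concentrations rather than the study's data, normalises individual analyte concentrations to the sum of the analytes and computes one example abundance ratio of the kind used to discriminate sources:

```python
import numpy as np

# Hypothetical concentrations (ug/g); the study's actual analyte list is far longer.
analytes = {
    "C1-naphthalenes": 84.0,
    "C2-naphthalenes": 130.0,
    "phenanthrene": 55.0,
    "dibenzothiophene": 21.0,
}

# Normalise each analyte to the sum of the analytes (percent of total),
# as done before comparing the unknown sample with reference materials.
total = sum(analytes.values())
normalised = {name: 100.0 * c / total for name, c in analytes.items()}

# Example diagnostic ratio between two source-sensitive constituents.
ratio = analytes["phenanthrene"] / analytes["dibenzothiophene"]
print(normalised, ratio)
```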

Relevance: 10.00%

Abstract:

We consider multidimensional backward stochastic differential equations (BSDEs). We prove the existence and uniqueness of solutions when the coefficients grow super-linearly and, moreover, need be locally Lipschitz neither in the variable y nor in the variable z. This is done for super-linear growth coefficients and a p-integrable terminal condition (p > 1). As an application, we establish the existence and uniqueness of solutions to degenerate semilinear PDEs with super-linear growth generators and Lp terminal data, p > 1. Our results cover, for instance, the case of PDEs with logarithmic nonlinearities.
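For reference, a minimal statement of the standard multidimensional BSDE formulation assumed here (our notation, not necessarily the paper's):

```latex
Y_t = \xi + \int_t^T f(s, Y_s, Z_s)\,ds - \int_t^T Z_s\,dW_s,
\qquad 0 \le t \le T,
```

where $W$ is a Brownian motion, $\xi \in L^p$ with $p > 1$ is the terminal condition, and the generator $f$ is allowed to grow super-linearly in $(y, z)$.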

Relevance: 10.00%

Abstract:

Variational steepest descent approximation schemes for the modified Patlak-Keller-Segel equation with a logarithmic interaction kernel in any dimension are considered. We prove the convergence of the implicit Euler scheme, suitably interpolated in time and defined in terms of the Euclidean Wasserstein distance, associated with this equation for sub-critical masses. As a consequence, we recover the recent result on the global-in-time existence of weak solutions to the modified Patlak-Keller-Segel equation with logarithmic interaction kernel in any dimension in the sub-critical case. Moreover, we show how this method performs numerically in one dimension. In this particular case, the numerical scheme corresponds to a standard implicit Euler method for the pseudo-inverse of the cumulative distribution function. We demonstrate its capability to reproduce, without the need for mesh refinement, the blow-up of solutions for super-critical masses.
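A minimal numerical sketch of the one-dimensional idea, assuming a uniform mass grid, an illustrative interaction strength and signs chosen to match the standard pseudo-inverse formulation (none of this is the paper's code):

```python
import numpy as np
from scipy.optimize import fsolve

# Implicit Euler for the pseudo-inverse u(m) of the cumulative distribution
# function of a 1D aggregation-diffusion model with logarithmic interaction.
M, n, dt, chi = 1.0, 50, 1e-3, 0.5     # total mass, grid points, time step, interaction strength
dm = M / n
m = (np.arange(n) + 0.5) * dm          # uniform mass grid (cell centres)
u = np.tan(np.pi * (m / M - 0.5))      # initial pseudo-inverse (heavy-tailed profile)

def rhs(u):
    du = np.gradient(u, dm)                     # u_m = 1/rho along the profile
    diffusion = -np.gradient(1.0 / du, dm)      # linear diffusion: u_t = -d/dm(1/u_m)
    attraction = np.array(                      # logarithmic-kernel drift (sign illustrative)
        [np.sum(dm / (u[i] - np.delete(u, i))) for i in range(n)])
    return diffusion - (chi / np.pi) * attraction

def implicit_euler_step(u_old):
    # Solve u_new = u_old + dt * rhs(u_new) for the next profile.
    return fsolve(lambda u_new: u_new - u_old - dt * rhs(u_new), u_old)

for _ in range(10):                             # a few time steps
    u = implicit_euler_step(u)
```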

Relevance: 10.00%

Abstract:

Hybrid satellite-terrestrial networks offer connectivity to remote and isolated areas and solve numerous communication problems. They nevertheless present several challenges, since communication takes place over a terrestrial mobile channel and a contiguous satellite channel. One of these challenges is to find mechanisms that perform routing and flow control jointly and efficiently. The goal of this project is to simulate and study existing algorithms that solve these problems, as well as to propose new ones, by means of various convex optimisation techniques. Based on the simulations carried out in this study, the various routing and flow-control problems have been analysed in depth, and the results obtained and the performance of the algorithms employed have been evaluated. In particular, algorithms based on the dual decomposition method, the subgradient method, Newton's method and the logarithmic barrier method, among others, have been successfully implemented in order to solve the routing and flow-control problems posed.
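As an illustration of the logarithmic barrier method mentioned above, the sketch below solves a toy network utility maximisation problem; the routing matrix, capacities and parameters are all assumed for the example, and this is not the project's code:

```python
import numpy as np

# Toy flow-control problem: maximise sum(log x) subject to R x <= c, x > 0,
# via the logarithmic barrier. R is a hypothetical links-by-flows routing matrix.
R = np.array([[1.0, 1.0, 0.0],     # link 1 carries flows 1 and 2
              [0.0, 1.0, 1.0]])    # link 2 carries flows 2 and 3
c = np.array([1.0, 2.0])           # link capacities

def barrier(x, t):
    slack = c - R @ x
    if np.any(x <= 0) or np.any(slack <= 0):
        return np.inf               # outside the interior: infeasible
    return -t * np.sum(np.log(x)) - np.sum(np.log(slack))

x, t = np.full(3, 0.1), 1.0         # strictly feasible start
for _ in range(8):                  # outer loop: tighten the barrier
    for _ in range(200):            # inner loop: gradient descent with backtracking
        slack = c - R @ x
        grad = -t / x + R.T @ (1.0 / slack)
        step = 1.0
        while barrier(x - step * grad, t) > barrier(x, t):
            step *= 0.5
            if step < 1e-12:
                break
        x = x - step * grad
    t *= 10.0                       # sharper barrier means closer to the optimum
print(x)                            # approximate optimal flow rates
```

A Newton inner loop, as used in the project, converges much faster; plain gradient descent keeps the sketch short.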

Relevance: 10.00%

Abstract:

Research project carried out by a secondary-school student and awarded a CIRIT Prize to foster the scientific spirit among young people in 2009. The project is based on experimentation and, subsequently, on obtaining and analysing the results of the experiment that creates Liesegang rings. This experiment, which consists of the precipitation of a compound in a gelled medium forming rings spaced logarithmically from one another, has for more than a century been the object of investigation by a great many scientists, who have never managed to extract a logical and reasonable explanation of this strange behaviour. The author set out to recreate these curious rings, attempting to form them with inhibitors and compounds different from those found in the literature. After carrying out more than thirty experiments, an exhaustive analysis of the results was performed. This section proved to be one of the most enriching, since it led to surprising comparisons and very curious findings, such as the similarity between Liesegang rings and Turing structures, which attempt to explain the patterns present in the eyespots of living beings, and the appearance of Liesegang rings under visual optics, an effect absent from the extensive literature consulted. In addition, a series of studies was carried out: one confirming the logarithmic distances between the rings, with a comparison between the empirical data and the mathematical pattern, and another studying the behaviour of the rings when the factors governing the reaction rate are varied.
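The logarithmic spacing can be checked with a few lines: under the classical spacing law, consecutive ring positions form an approximately geometric progression, so their logarithms are linear in the ring index. The positions below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical ring positions measured from the diffusion front (mm).
x = np.array([2.0, 3.1, 4.8, 7.4, 11.5])

ratios = x[1:] / x[:-1]           # spacing law: ratios of consecutive positions ~ constant
print(ratios)

n = np.arange(len(x))
slope, intercept = np.polyfit(n, np.log(x), 1)   # log positions ~ linear in ring index
print(np.exp(slope))              # the common ratio recovered from the fit
```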

Relevance: 10.00%

Abstract:

When dealing with sustainability we are concerned with the biophysical as well as the monetary aspects of economic and ecological interactions. This multidimensional approach requires that special attention be given to dimensional issues in curve fitting practice in economics. Unfortunately, many empirical and theoretical studies in economics, as well as in ecological economics, place dimensional numbers inside exponential or logarithmic functions. We show that it is an analytical error to put a dimensional quantity x into an exponential function (a^x) or a logarithmic function (log_a x). Secondly, we investigate the conditions on data sets under which a particular logarithmic specification is superior to the usual regression specification. This analysis shows that the superiority of the logarithmic specification in terms of the least-squares norm depends heavily on the available data set. The last section deals with economists' "curve fitting fetishism". We propose that a distinction be made between curve fitting over past observations and the development of a theoretical or empirical law capable of maintaining its fitting power for future observations. Finally, we conclude with several epistemological issues in relation to dimensions and curve fitting practice in economics.
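A minimal illustration of the fix this argument implies: the argument of a logarithm is made dimensionless by normalising with a reference quantity in the same unit, and the arbitrary choice of reference only shifts the result by a constant:

```python
import numpy as np

# Hypothetical capital series in dollars; np.log(capital_usd) alone would
# silently take the log of a dimensional number.
capital_usd = np.array([1.2e9, 2.5e9, 4.1e9])
ref_usd = 1e9                                # reference level in the same unit

log_capital = np.log(capital_usd / ref_usd)  # dimensionless argument: well defined
# Changing ref_usd shifts log_capital by a constant, which a regression
# intercept absorbs, so estimated slopes are unaffected.
```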

Relevance: 10.00%

Abstract:

In the present paper, we study the geometric discrepancy with respect to families of rotated rectangles. The well-known extremal cases are the axis-parallel rectangles (logarithmic discrepancy) and rectangles rotated in all possible directions (polynomial discrepancy). We study several intermediate situations: lacunary sequences of directions, lacunary sets of finite order, and sets with small Minkowski dimension. In each of these cases, extensions of a lemma due to Davenport allow us to construct appropriate rotations of the integer lattice which yield small discrepancy.

Relevance: 10.00%

Abstract:

Trisomy-21 (Down syndrome) is the most frequent chromosomal abnormality, but only one third of cases would be detected by amniocentesis based on maternal age alone. Serum screening tests in the early second trimester increase the detection rate to 60-65%, and more recently it was found that such screening is also possible in the first trimester by quantifying a different panel of markers. The concentrations of these placental proteins are strongly dependent on gestational age; thus control medians must be established and precise dating is essential. Serum human chorionic gonadotrophin (HCG) levels were recently found to be increased in IVF pregnancies compared to spontaneous gestations, leading to a falsely elevated trisomy screening risk. The aim of this preliminary study was to find out whether, in first-trimester screening, the markers similarly differ between IVF and spontaneous pregnancies, which would call for the establishment of separate normal medians for IVF patients. We compared 24 pregnancies obtained after ovarian stimulation and IVF with six women after thawed embryo transfer (unstimulated cycle) and 63 gestation- and maternal-age-matched spontaneously pregnant controls. A single serum sample was obtained between 6 and 16 weeks of gestation and various placental protein levels were determined by immunometric assays. Serum levels of pregnancy-associated plasma protein A (PAPP-A), the major marker in the first-trimester screening test, were reduced in IVF pregnancies: after 9 weeks of gestation, multiples of the median (MoMs) ranged between 0.23 and 3.58 (logarithmic mean 0.743). For the frozen/thawed transfers, this value was 1.08. In the 9-12 week group, containing 6 IVF cases, three thawed transfers and 25 controls, PAPP-A was significantly reduced in the stimulated compared to the unstimulated cycles. In the late first and early second trimester the difference was not significant in our small group, but the trend persisted. Pregnancies after IVF will thus show an increased incidence of false-positive results in fetal trisomy-21 screening, and special medians should be established for these patients.
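The "logarithmic mean" of MoM values quoted above is the geometric mean, computed by averaging on the log scale, as in this small sketch (the MoM values here are hypothetical):

```python
import numpy as np

moms = np.array([0.23, 0.51, 0.74, 1.10, 3.58])   # hypothetical PAPP-A MoMs
log_mean = np.exp(np.mean(np.log(moms)))          # geometric (logarithmic) mean
print(round(log_mean, 3))
```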

Relevance: 10.00%

Abstract:

We analyze the rate of convergence towards self-similarity for the subcritical Keller-Segel system in the radially symmetric two-dimensional case and in the corresponding one-dimensional case for logarithmic interaction. We measure convergence in Wasserstein distance. The rate of convergence towards self-similarity does not degenerate as we approach the critical case. As a byproduct, we obtain a proof of the logarithmic Hardy-Littlewood-Sobolev inequality in the one-dimensional and radially symmetric two-dimensional cases based on optimal transport arguments. In addition we prove that the one-dimensional equation is a contraction with respect to the Fourier distance in the subcritical case.

Relevance: 10.00%

Abstract:

This paper examines the proper use of dimensions and curve fitting practices, elaborating on Georgescu-Roegen's economic methodology in relation to the three main concerns of his epistemological orientation. Section 2 introduces two critical issues concerning dimensions and curve fitting practices in economics in view of Georgescu-Roegen's economic methodology. Section 3 deals with the logarithmic function (ln z) and shows that z must be a dimensionless pure number, otherwise it is nonsensical. Several unfortunate examples of this analytical error are presented, including macroeconomic data analysis conducted by a representative figure in this field. Section 4 deals with the standard Cobb-Douglas function: it is shown that no operational meaning can be obtained for capital or labour within the Cobb-Douglas function. Section 4 also deals with economists' "curve fitting fetishism". Section 5 concludes the paper with several epistemological issues in relation to dimensions and curve fitting practices in economics.
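Section 4's point about the Cobb-Douglas function can be seen from a one-line dimensional check (our illustration, in the usual notation):

```latex
Y = A\,K^{\alpha}L^{\beta},\qquad
[Y] = \$,\quad [K] = \$,\quad [L] = \text{hours}
\;\Longrightarrow\;
[A] = \$^{\,1-\alpha}\,\text{hours}^{-\beta},
```

so the dimensions of $A$ change with every estimated pair $(\alpha, \beta)$, leaving $A$, and with it $K$ and $L$ as they enter the function, without a stable operational meaning.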

Relevance: 10.00%

Abstract:

In this paper the two main drawbacks of heat balance integral methods are examined. Firstly, we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the Heat Balance and Refined Integral methods to determine the power of the highest-order term leads either to the same or, more often, to greatly improved accuracy over standard methods. Secondly, we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value; the new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the new CIM, than the polynomial profile. The analysis focuses primarily on a specified constant boundary temperature and is then extended to constant-flux, Newton cooling and time-dependent boundary conditions.
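A compact sketch of the basic heat balance integral method for the textbook case of a semi-infinite bar with a fixed boundary temperature; this is the standard polynomial-profile calculation, shown for orientation, not the paper's logarithmic profile:

```python
import numpy as np
from scipy.special import erf

# HBIM with profile u/T_s = (1 - x/delta)^n; the heat balance integral
# gives delta(t) = sqrt(2 n (n+1) kappa t) for this classic problem.
kappa, n, Ts, t = 1.0, 2.0, 1.0, 1.0
delta = np.sqrt(2 * n * (n + 1) * kappa * t)   # heat penetration depth

x = np.linspace(0, delta, 200)
u_hbim = Ts * (1 - x / delta) ** n                      # approximate profile
u_exact = Ts * (1 - erf(x / (2 * np.sqrt(kappa * t))))  # exact similarity solution
print(np.max(np.abs(u_hbim - u_exact)))                 # worst-case approximation error
```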

Relevance: 10.00%

Abstract:

BACKGROUND: Shared Decision Making (SDM) is increasingly advocated as a model for medical decision making, yet its use in clinical practice remains low. High-impact-factor journals might represent an efficient channel for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high-impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal's website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text; these were referred to as SDM publications. A polynomial Poisson regression model with logarithmic link function was used to assess the evolution of the number of SDM publications over the period according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in the 15 journals from 1996 to 2011. The absolute number of SDM publications by journal ranged from 2 to 273 over the 16 years. SDM publications increased both in absolute and in relative numbers per year, from 46 (0.32% of all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters and editorials. The increase of research publications across time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119, respectively). CONCLUSION: This full-text review showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. This growth might reflect an increased dissemination of the SDM concept within the medical community.
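The modelling step can be sketched as follows, with hypothetical annual counts standing in for the per-year data, which the abstract does not report; the logarithmic link makes exponential growth appear as a linear term in the log-rate:

```python
import numpy as np
import statsmodels.api as sm

years = np.arange(1996, 2012)
t = years - years.min()
rng = np.random.default_rng(0)
counts = rng.poisson(46 * np.exp(0.08 * t))   # hypothetical counts, roughly exponential growth

X = sm.add_constant(np.column_stack([t, t**2]))              # polynomial terms in year
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()  # Poisson GLM, log link by default
print(fit.summary())                          # a significant t term signals exponential growth
```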

Relevance: 10.00%

Abstract:

Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
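One frame-by-frame "movie" of this kind can be sketched with a Box-Cox power transformation, which interpolates between plain PCA (power 1) and PCA of log-transformed data (power 0); the data here are synthetic and the pair of methods is just one of those mentioned above:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.lognormal(size=(50, 4))                 # synthetic positive data

def boxcox(X, alpha):
    # Box-Cox power transform: log at alpha = 0, shifted identity at alpha = 1.
    return np.log(X) if alpha == 0 else (X**alpha - 1) / alpha

for alpha in np.linspace(1, 0, 6):              # the frames of the movie
    Z = boxcox(X, alpha)
    Zc = Z - Z.mean(axis=0)                     # centre, then PCA via SVD
    _, s, Vt = np.linalg.svd(Zc, full_matrices=False)
    scores = Zc @ Vt[:2].T                      # 2-D map to plot for this frame
```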

Relevance: 10.00%

Abstract:

In this paper we examine the problem of compositional data from a different starting point. Chemical compositional data, as used in provenance studies on archaeological materials, will be approached from measurement theory. The results will show, in a very intuitive way, that chemical data can only be treated using the approach developed for compositional data. It will be shown that compositional data analysis is a particular case in projective geometry, where the projective coordinates lie in the positive orthant and have the properties of logarithmic interval metrics. Moreover, it will be shown that this approach can be extended to a very large number of applications, including shape analysis. This will be exemplified with a case study on the architecture of Early Christian churches dated back to the 5th-7th centuries AD.
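The "projective coordinates in the positive orthant" view has a direct computational counterpart: log-ratio coordinates are invariant under rescaling of the composition, i.e. they depend only on the projective point. A minimal sketch with a hypothetical three-part composition:

```python
import numpy as np

x = np.array([0.6, 0.3, 0.1])                  # a composition in the positive orthant
clr = np.log(x) - np.log(x).mean()             # centred log-ratio coordinates

# Any rescaling of x represents the same projective point and gives the same clr.
assert np.allclose(clr, np.log(5 * x) - np.log(5 * x).mean())
```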

Relevance: 10.00%

Abstract:

Background. DNA-damage assays, which quantify the initial number of DNA double-strand breaks induced by radiation, have been proposed as a predictive test for radiation-induced toxicity. Determination of radiation-induced apoptosis in peripheral blood lymphocytes by flow cytometry has also been proposed as an approach for predicting normal tissue responses following radiotherapy. The aim of the present study was to explore the association between initial DNA damage, estimated by the number of double-strand breaks induced by a given radiation dose, and the observed radiation-induced apoptosis rates. Methods. Peripheral blood lymphocytes were taken from 26 consecutive patients with locally advanced breast carcinoma. Radiosensitivity of lymphocytes was quantified as the initial number of DNA double-strand breaks induced per Gy and per DNA unit (200 Mbp). Radiation-induced apoptosis at 1, 2 and 8 Gy was measured by flow cytometry using annexin V/propidium iodide. Results. Radiation-induced apoptosis increased with radiation dose, and the data fitted a semilogarithmic mathematical model. A positive correlation was found among the radiation-induced apoptosis values at the different radiation doses of 1, 2 and 8 Gy (p < 0.0001 in all cases). The mean number of DSB/Gy/DNA unit was 1.70 ± 0.83 (range 0.63-4.08; median 1.46). A statistically significant inverse correlation was found between initial DNA damage and radiation-induced apoptosis at 1 Gy (p = 0.034). A similar trend was observed at 2 Gy (p = 0.057) and 8 Gy (p = 0.067) after 24 hours of incubation. Conclusions. An inverse association was observed for the first time between these variables, both considered predictive factors of radiation toxicity.
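The semilogarithmic model amounts to fitting the apoptosis rate as a linear function of the logarithm of dose; a minimal sketch with hypothetical rates (the paper reports correlations, not these numbers):

```python
import numpy as np

dose = np.array([1.0, 2.0, 8.0])               # radiation doses (Gy)
apoptosis = np.array([12.0, 18.0, 31.0])       # hypothetical apoptosis rates (%)

b, a = np.polyfit(np.log(dose), apoptosis, 1)  # least squares on the semilog scale
print(f"apoptosis ~ {a:.1f} + {b:.1f} * ln(dose)")
```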