969 results for PARAMETRIC TRANSDUCERS


Relevance: 10.00%

Abstract:

BACKGROUND: Type 2 diabetes mellitus (T2DM) is an emerging risk factor for cognitive impairment. Whether this impairment is a direct effect of the metabolic disorder on brain function, a consequence of vascular disease, or both remains unknown. Structural and functional neuroimaging studies in patients with T2DM could help to elucidate this question. OBJECTIVE: We designed a cross-sectional study comparing 25 T2DM patients with 25 age- and gender-matched healthy control participants. Clinical information, APOE genotype, lipid and glucose analyses, structural cerebral magnetic resonance imaging including voxel-based morphometry, and F-18 fluorodeoxyglucose positron emission tomography were obtained for all subjects. METHODS: Gray matter densities and metabolic differences between groups were analyzed using statistical parametric mapping. In addition to comparing the neuroimaging profiles of the two groups, we correlated neuroimaging findings with HbA1c levels, duration of T2DM, and an insulin resistance measure (HOMA-IR) in the diabetic patient group. RESULTS: Patients with T2DM presented reduced gray matter densities and reduced cerebral glucose metabolism in several fronto-temporal brain regions after controlling for various vascular risk factors. Furthermore, within the T2DM group, longer disease duration and higher HbA1c levels and HOMA-IR were associated with lower gray matter density and reduced cerebral glucose metabolism in fronto-temporal regions. CONCLUSION: In agreement with previous reports, our findings indicate that T2DM leads to structural and metabolic abnormalities in fronto-temporal areas. Furthermore, they suggest that these abnormalities are not entirely explained by the role of T2DM as a cardiovascular risk factor.
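Statistical parametric mapping, as used in the group comparison above, is at heart a mass-univariate test at every voxel followed by a multiple-comparisons correction. A minimal sketch of that idea on synthetic data (the voxel counts, effect size, and Benjamini-Hochberg FDR correction below are illustrative choices of ours, not the authors' SPM pipeline):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic gray matter density maps: 25 patients vs. 25 controls,
# flattened to 1000 "voxels"; patients have reduced density in voxels 0-49.
controls = rng.normal(0.60, 0.05, size=(25, 1000))
patients = rng.normal(0.60, 0.05, size=(25, 1000))
patients[:, :50] -= 0.08  # simulated fronto-temporal reduction

# Mass-univariate two-sample t-test at each voxel.
t, p = stats.ttest_ind(patients, controls, axis=0)

# Benjamini-Hochberg FDR correction across voxels.
order = np.argsort(p)
ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
significant = np.zeros(len(p), dtype=bool)
passing = np.nonzero(ranked <= 0.05)[0]
if passing.size:
    significant[order[: passing[-1] + 1]] = True

print(significant[:50].mean())   # fraction of truly affected voxels detected
print(significant[50:].mean())   # false-positive fraction elsewhere
```

With a genuine group difference of 0.08 against a within-group standard deviation of 0.05, essentially all affected voxels survive correction while false positives stay near the nominal rate.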

Relevance: 10.00%

Abstract:

The main result of this work is a parametric description of the spectral surfaces of a class of periodic 5-diagonal matrices, related to the strong moment problem. This class is a self-adjoint twin of the class of CMV matrices. Jointly they form the simplest possible classes of 5-diagonal matrices.
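For intuition only (this is not the parametric description of spectral surfaces derived in the paper), a self-adjoint 5-diagonal matrix can be assembled numerically and its spectrum computed; the entries below are arbitrary illustrative values:

```python
import numpy as np

n = 8
rng = np.random.default_rng(1)

# Build a real symmetric (self-adjoint) matrix of bandwidth 2,
# i.e., with 5 nonzero diagonals at offsets -2, -1, 0, 1, 2.
d0 = rng.normal(size=n)
d1 = rng.normal(size=n - 1)
d2 = rng.normal(size=n - 2)
A = (np.diag(d0) + np.diag(d1, 1) + np.diag(d1, -1)
     + np.diag(d2, 2) + np.diag(d2, -2))

assert np.allclose(A, A.T)       # self-adjointness
eigs = np.linalg.eigvalsh(A)     # spectrum of a self-adjoint matrix is real
print(eigs)
```

Self-adjointness guarantees a real spectrum, which is what separates this twin class from the (unitary) CMV matrices mentioned above.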

Relevance: 10.00%

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
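The multiplicative replacement discussed above can be sketched in a few lines (a minimal implementation following Martín-Fernández et al., 2003; the imputation values delta are user-supplied, e.g. a fraction of each part's detection limit):

```python
import numpy as np

def multiplicative_replacement(x, delta):
    """Replace rounded zeros in a composition x (closed to a total c) by
    delta, rescaling the non-zero parts multiplicatively so that the
    total c is preserved and ratios among non-zero parts are unchanged."""
    x = np.asarray(x, dtype=float)
    delta = np.asarray(delta, dtype=float)
    c = x.sum()
    zero = x == 0
    r = x.copy()
    r[zero] = delta[zero]
    # non-zero parts are shrunk by the total mass assigned to the zeros
    r[~zero] = x[~zero] * (1.0 - delta[zero].sum() / c)
    return r

x = np.array([0.0, 35.0, 45.0, 20.0])   # percent composition, one rounded zero
r = multiplicative_replacement(x, np.full(4, 0.5))
print(r, r.sum())
```

Note how the closure (sum = 100) and the ratio between the non-zero parts are both preserved, which is exactly the coherence property claimed for the multiplicative approach.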

Relevance: 10.00%

Abstract:

There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot coexist with nepheline. Another common essential zero is a North azimuth; however, we can always change that zero for the value of 360°. These are known as "essential zeros", but what can we do with "rounded zeros", which result from values below the detection limit of the equipment? Amalgamation, e.g., adding Na2O and K2O as total alkalis, is a solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the limit of detection of the equipment used will generate spurious distributions, especially in ternary diagrams. The same situation will occur if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method that we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between the copper values and the molybdenum ones, but while copper will always be above the limit of detection, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method could be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.

Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
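The regression-based imputation described above can be sketched on synthetic data (fitting the regression in log space is an assumption of ours, motivated by the lognormal behavior noted in the abstract; the detection limit and variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic Cu (ppm), always above detection, with correlated Mo (ppm).
cu = np.exp(rng.normal(6.0, 0.5, 200))
mo = np.exp(0.8 * np.log(cu) - 2.0 + rng.normal(0, 0.3, 200))

det_limit = 10.0
censored = mo < det_limit        # "rounded zeros": Mo below detection

# Regress Mo on Cu over the lower quartile of the measured Mo values,
# as the method prescribes.
obs_cu, obs_mo = cu[~censored], mo[~censored]
low = obs_mo <= np.quantile(obs_mo, 0.25)
slope, intercept = np.polyfit(np.log(obs_cu[low]), np.log(obs_mo[low]), 1)

# Impute each censored Mo from its own Cu value, not a fixed constant.
mo_imputed = np.exp(intercept + slope * np.log(cu[censored]))
print(mo_imputed[:3])
```

Because each imputed value is driven by the corresponding Cu measurement, the replacements vary from sample to sample, which is the advantage the abstract emphasizes over fixed-value substitution.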

Relevance: 10.00%

Abstract:

I use a multi-layer feedforward perceptron, with backpropagation learning implemented via stochastic gradient descent, to extrapolate the volatility smile of Euribor derivatives over low strikes by training the network on parametric prices.
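A minimal sketch of this setup with scikit-learn's SGD-trained multi-layer perceptron (the quadratic "parametric" smile below is a stand-in we invented for illustration; it is not the parametric model used for the Euribor data, and real smile extrapolation would need far more care):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-in "parametric" volatility smile: quadratic in moneyness k.
def smile(k):
    return 0.20 + 0.5 * k**2 - 0.1 * k

# Train only on moderate strikes; low strikes are deliberately held out.
k_train = rng.uniform(-0.5, 1.0, 400).reshape(-1, 1)
y_train = smile(k_train).ravel()

net = MLPRegressor(hidden_layer_sizes=(32, 32), solver="sgd",
                   learning_rate_init=0.01, max_iter=5000,
                   random_state=0, tol=1e-7)
net.fit(k_train, y_train)

# Extrapolate the smile over low strikes not seen during training.
k_low = np.array([[-0.7], [-0.6]])
print(net.predict(k_low))
```

The network interpolates the parametric prices over the training range and is then queried outside it, which is the essence of the approach; how trustworthy the extrapolation is depends heavily on the parametric model and the strike range.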

Relevance: 10.00%

Abstract:

Initial topography and inherited structural discontinuities are known to play a dominant role in rock slope stability. Previous 2-D physical modeling results demonstrated that, even if few preexisting fractures are activated or propagated during gravitational failure, all of these heterogeneities have a great influence on the mobilized volume and its kinematics. The question we address in the present study is whether such a result is also observed in 3-D. As in the previous 2-D models, we examine a geologically stable model configuration based upon the well-documented landslide at Randa, Switzerland. The 3-D models consisted of a homogeneous material in which several fracture zones were introduced in order to study simplified but realistic configurations of discontinuities (i.e., based on a natural example rather than a parametric study). Results showed that the type of gravitational failure (deep-seated landslide or sequential failure) and the resulting slope morphology evolution arise from the interplay of initial topography and inherited preexisting fractures (orientation and density). The three main results are: i) the initial topography exerts a strong control on gravitational slope failure; indeed, in each tested configuration (even the isotropic one without fractures) the model is affected by a rock slide; ii) the number of simulated fracture sets greatly influences the volume mobilized and its kinematics; and iii) the failure zone involved in the 1991 event is smaller than the results produced by the analog modeling. This may indicate that the zone mobilized in 1991 is potentially only part of a larger deep-seated landslide and/or a wider deep-seated gravitational slope deformation.

Relevance: 10.00%

Abstract:

ABSTRACT: Copula theory was used to analyze contagion between the BRIC (Brazil, Russia, India and China) and European Union stock markets and the U.S. equity market. The market indexes used for the period between January 01, 2005 and February 27, 2010 are: MXBRIC (BRIC), MXEU (European Union) and MXUS (United States). This article evaluated the adequacy of the main copulas found in the financial literature using the log-likelihood, Akaike information and Bayesian information criteria. This article provides a groundbreaking study in the area of contagion due to the use of conditional copulas, allowing the correlation increase between indexes to be calculated with a non-parametric approach. The conditional Symmetrized Joe-Clayton copula fit the considered pairs of returns best. Results indicate evidence of a contagion effect in both markets, European Union and BRIC members, at a 5% significance level. Furthermore, there is also evidence that the contagion of the U.S. financial crisis was more pronounced in the European Union than in the BRIC markets, at a 5% significance level. Therefore, stock portfolios formed by equities from the BRIC countries were able to offer greater protection during the subprime crisis. The results are aligned with recent papers that report an increase in correlation between stock markets, especially in bear markets.
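The copula-fitting step can be illustrated with a simpler member of the same family: below we fit a plain Clayton copula to pseudo-observations by maximum likelihood, the same spirit as the paper's log-likelihood-based model selection. The paper itself uses the conditional Symmetrized Joe-Clayton copula, which is considerably more involved; the returns here are synthetic stand-ins.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

rng = np.random.default_rng(7)

# Synthetic dependent returns (stand-ins for two market indexes).
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=1500)
x, y = z[:, 0], z[:, 1]

# Pseudo-observations: ranks rescaled to the open unit interval.
u = rankdata(x) / (len(x) + 1)
v = rankdata(y) / (len(y) + 1)

def clayton_negloglik(theta):
    """Negative log-likelihood of the Clayton copula (theta > 0)."""
    s = u**(-theta) + v**(-theta) - 1.0
    logc = (np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v))
            - (2 + 1 / theta) * np.log(s))
    return -logc.sum()

res = minimize_scalar(clayton_negloglik, bounds=(0.01, 20), method="bounded")
theta_hat = res.x
print(theta_hat)
```

A larger fitted theta indicates stronger lower-tail dependence, the feature that makes this copula family natural for studying contagion in bear markets.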

Relevance: 10.00%

Abstract:

Summary: Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performances of the WMLs based on each of them are studied. It turns out that in a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Besides, the performance of the WML is rather stable under a change of the MDE, even if the MDEs have very different behaviors. Two examples of application of the WML to real data are considered. In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. This procedure is particularly natural in the discrete distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
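The two-phase idea can be sketched for a Poisson model (the median-based initial estimate and the hard 0/1 weighting below are crude stand-ins of ours for the minimum disparity estimator and the adaptive weights of the paper):

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)

# Poisson(4) sample contaminated with a few gross outliers.
data = np.concatenate([rng.poisson(4.0, 95), np.full(5, 40)])

# Phase 1: very robust (but inefficient) initial estimate of lambda.
lam0 = np.median(data)

# Flag as outliers the observations that are essentially impossible
# under the initial fit.
weights = (poisson.pmf(data, lam0) > 1e-6).astype(float)

# Phase 2: weighted ML estimate; for the Poisson this is a weighted mean.
lam_wml = np.sum(weights * data) / np.sum(weights)

print(np.mean(data))  # plain MLE, corrupted by the outliers
print(lam_wml)        # close to the uncontaminated rate
```

The plain maximum likelihood estimate (the sample mean) is dragged upward by the five outliers, while the weighted estimate stays near the true rate, illustrating why the robust first phase matters.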

Relevance: 10.00%

Abstract:

Phototropism, or plant growth in response to unidirectional light, is an adaptive response of crucial importance. Lateral differences in low fluence rates of blue light are detected by phototropin 1 (phot1) in Arabidopsis. Only NONPHOTOTROPIC HYPOCOTYL 3 (NPH3) and root phototropism 2, both belonging to the same family of proteins, have been previously identified as phototropin-interacting signal transducers involved in phototropism. PHYTOCHROME KINASE SUBSTRATE (PKS) 1 and PKS2 are two phytochrome signaling components belonging to a small gene family in Arabidopsis (PKS1-PKS4). The strong enhancement of PKS1 expression by blue light and its light induction in the elongation zone of the hypocotyl prompted us to study the function of this gene family during phototropism. Photobiological experiments show that the PKS proteins are critical for hypocotyl phototropism. Furthermore, PKS1 interacts with phot1 and NPH3 in vivo at the plasma membrane and in vitro, indicating that the PKS proteins may function directly with phot1 and NPH3 to mediate phototropism. The phytochromes are known to influence phototropism but the mechanism involved is still unclear. We show that PKS1 induction by a pulse of blue light is phytochrome A-dependent, suggesting that the PKS proteins may provide a molecular link between these two photoreceptor families.

Relevance: 10.00%

Abstract:

A new plastic self-expanding Smartcanula (Smartcanula LLC, Lausanne, Switzerland) is designed for central insertion and prevention of caval collapse. The objective of our work is to assess the influence of the new design on atrial chatter. Caval collapse over the entire caval axis (right atrial, hepatic, renal, and iliac vein levels) is reproduced in drainage tubes with holes at 5-cm intervals. Smartcanulas of various lengths (26 cm [right atrial], 34 cm [hepatic], 43 cm [renal], and 53 cm [iliac]) are compared against two-stage cannulas. Pressure drop (ΔP) is measured using Millar pressure transducers. Flow rate (Q) is measured using an ultrasonic flow meter. Cannula resistance is defined as the ΔP/Q ratio. Data display and recording are controlled using LabView virtual instruments. At an 88 cm height differential, Q values are 8.69 and 6.8 l/min, and ΔP/Q ratios are 0.63 and 1.28 for the 26-cm Smartcanula and the reference cannula, respectively. The 34-cm Smartcanula showed 8.89 l/min and a ΔP/Q ratio of 0.6 vs. 7.59 l/min and 0.9 for the control cannula (P < 0.05). The 43-cm and 53-cm Smartcanulas showed Q values of 9.04 and 8.81 l/min, respectively, and a ΔP/Q ratio of 0.6. The Smartcanula outperforms the two-stage cannula, and direct cannula insertion without a guide wire is effective.

Relevance: 10.00%

Abstract:

In recent years, telemetry systems for medical applications have grown significantly in the diagnosis and monitoring of, for example, glucose, blood pressure, temperature, and heart rate. Implanted devices broaden the applications of medicine and bring an improvement in quality of life for the user. For this reason, this project studies two of the most common antennas, the dipole and the patch antenna, the latter being especially used in implanted applications. In the analysis of these antennas, characteristics related to the application environment, as well as to the antenna itself, have been parameterized, explaining the behavior that, in contrast to free space, the antennas exhibit under changes of these parameters. At the same time, a setup for measuring implanted antennas based on a single-layer model of the human body has been implemented. Compared with the results of simulations carried out with the FEKO software, good agreement has been obtained in the empirical measurement of the matching and gain of the microstrip antennas. Thanks to the parametric analysis, this project also presents several antenna designs that optimize the realizable gain, with the goal of achieving the best possible communication with the external device or base station.

Relevance: 10.00%

Abstract:

This paper investigates a simple procedure to robustly estimate the mean of an asymmetric distribution. The procedure removes the observations that are larger or smaller than certain limits and takes the arithmetic mean of the remaining observations, the limits being determined with the help of a parametric model, e.g., the Gamma, the Weibull, or the Lognormal distribution. The breakdown point, the influence function, the (asymptotic) variance, and the contamination bias of this estimator are explored and compared numerically with those of competing estimators.
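The procedure above can be sketched with a Gamma model (the 1%/99% quantile levels are our illustrative choice, not necessarily the paper's):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Skewed data with a couple of gross outliers.
data = np.concatenate([rng.gamma(2.0, 3.0, 500), [150.0, 200.0]])

# Fit the parametric model (Gamma, location fixed at 0).
shape, loc, scale = stats.gamma.fit(data, floc=0)

# Trimming limits = extreme quantiles of the fitted model.
lo = stats.gamma.ppf(0.01, shape, loc=loc, scale=scale)
hi = stats.gamma.ppf(0.99, shape, loc=loc, scale=scale)

# Arithmetic mean of the observations inside the limits.
kept = data[(data >= lo) & (data <= hi)]
robust_mean = kept.mean()

print(data.mean())    # pulled up by the outliers
print(robust_mean)    # near the true mean of 6
```

Because the limits come from a fitted parametric model rather than fixed sample quantiles, the amount of trimming adapts to the shape of the distribution, which is what distinguishes this estimator from an ordinary trimmed mean.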

Relevance: 10.00%

Abstract:

Genetically engineered organisms expressing spectroscopically active reporter molecules in response to chemical effectors display great potential as living transducers in sensing applications. Green fluorescent protein (gfp gene) bioreporters have distinct advantages over luminescent counterparts (lux gene), including applicability at the single-cell level, but are typically less sensitive. Here we describe a gfp-bearing bioreporter that is sensitive to naphthalene (a poorly water-soluble pollutant behaving like a large class of hydrophobic compounds), is suitable for use in chemical assays and bioavailability studies, and has detection limits comparable to lux-bearing bioreporters for higher-efficiency detection strategies. At the same time, we find that exploiting population response data from single-cell analysis is not an algorithmic conduit to enhanced signal detection and hence lower effector detection limits, as normally assumed. The assay reported functions to equal effect with or without biocide.

Relevance: 10.00%

Abstract:

Time scale parametric spike train distances like the Victor and the van Rossum distances are often applied to study the neural code based on neural stimuli discrimination. Different neural coding hypotheses, such as rate or coincidence coding, can be assessed by combining a time scale parametric spike train distance with a classifier in order to obtain the optimal discrimination performance. The time scale for which the responses to different stimuli are distinguished best is assumed to be the discriminative precision of the neural code. The relevance of temporal coding is evaluated by comparing the optimal discrimination performance with the one achieved when assuming a rate code. We here characterize the measures quantifying the discrimination performance, the discriminative precision, and the relevance of temporal coding. Furthermore, we evaluate the information these quantities provide about the neural code. We show that the discriminative precision is too unspecific to be interpreted in terms of the time scales relevant for encoding. Accordingly, the time scale parametric nature of the distances is mainly an advantage because it allows maximizing the discrimination performance across a whole set of measures with different sensitivities determined by the time scale parameter, but not due to the possibility to examine the temporal properties of the neural code.
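A minimal implementation of one of the distances named above, the van Rossum distance: each spike train is convolved with a causal exponential kernel of time scale tau and an L2 distance is taken between the filtered traces (the time grid and normalization below are our discretization choices):

```python
import numpy as np

def van_rossum_distance(train1, train2, tau, dt=0.1, t_max=100.0):
    """Van Rossum distance between two spike trains (spike times in ms):
    convolve each train with exp(-t/tau) and take an L2 distance."""
    t = np.arange(0.0, t_max, dt)

    def filtered(train):
        f = np.zeros_like(t)
        for s in train:
            f += np.where(t >= s, np.exp(-(t - s) / tau), 0.0)
        return f

    diff = filtered(train1) - filtered(train2)
    return np.sqrt(np.sum(diff**2) * dt / tau)

a = [10.0, 30.0, 55.0]
b = [12.0, 30.0, 70.0]
print(van_rossum_distance(a, b, tau=1.0))   # small tau: coincidence-like
print(van_rossum_distance(a, b, tau=50.0))  # large tau: rate-like
```

Sweeping tau changes the distance's sensitivity from coincidence-like (small tau) to rate-like (large tau), which is exactly the time scale parametric behavior the abstract discusses.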

Relevance: 10.00%

Abstract:

This paper describes a maximum likelihood method using historical weather data to estimate a parametric model of daily precipitation and maximum and minimum air temperatures. Parameter estimates are reported for Brookings, SD, and Boone, IA, to illustrate the procedure. The use of this parametric model to generate stochastic time series of daily weather is then summarized. A soil temperature model is described that determines daily average, maximum, and minimum soil temperatures based on air temperatures and precipitation, following a lagged process due to soil heat storage and other factors.
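The generation step can be sketched as follows (a two-state Markov chain for precipitation occurrence, a seasonal-plus-noise model for air temperature, and a first-order lag for soil temperature; all parameter values are invented for illustration and are not the Brookings or Boone estimates):

```python
import numpy as np

rng = np.random.default_rng(11)
n_days = 365

# Precipitation occurrence: first-order two-state Markov chain.
p_wet_given_wet, p_wet_given_dry = 0.6, 0.2
wet = np.zeros(n_days, dtype=bool)
for d in range(1, n_days):
    p = p_wet_given_wet if wet[d - 1] else p_wet_given_dry
    wet[d] = rng.random() < p
precip = np.where(wet, rng.exponential(6.0, n_days), 0.0)  # mm

# Daily mean air temperature: seasonal cycle plus noise (deg C).
day = np.arange(n_days)
air = 10.0 - 15.0 * np.cos(2 * np.pi * day / 365) + rng.normal(0, 3, n_days)

# Soil temperature follows air temperature with a lag (heat storage).
soil = np.empty(n_days)
soil[0] = air[0]
for d in range(1, n_days):
    soil[d] = 0.8 * soil[d - 1] + 0.2 * air[d]

print(precip[:5], air[:5], soil[:5])
```

The first-order lag damps the day-to-day noise while tracking the seasonal cycle, mimicking the smoothing effect of soil heat storage described above.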