61 results for Eddy current testing
Abstract:
Method validation is one of the fundamental pillars of quality assurance in analytical laboratories, as reflected in the ISO/IEC 17025 standard. It is therefore a topic that must be addressed in the curricula of current and future degrees in Chemistry. There is extensive literature on method validation, but it is often underused because of the evident difficulty of processing all the available information and applying it to the laboratory and to specific problems. Another limitation in this field is the lack of software adapted to the needs of the laboratory. Many of the statistical routines used in method validation are ad hoc adaptations built in Microsoft Excel, or come bundled in large, highly complex statistical packages. For this reason, the aim of the project was to create a software tool for method validation and for the quality assurance of analytical results that incorporates only the routines that are actually needed. Specifically, the software includes the statistical functions required to verify the trueness and evaluate the precision of an analytical method. The programming language chosen was Java, version 6. The development of the software comprised the following stages: requirements gathering, requirements analysis, modular software design, programming of the application functions and the graphical interface, creation of integration tests and trials with real users, and, finally, deployment of the software (creation of the installer and distribution).
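The abstract does not reproduce any of the project's Java 6 code; purely as a minimal sketch of the kind of routine it describes (verifying trueness against a certified reference value with a one-sample t-test and expressing repeatability precision as a relative standard deviation), written here in Python with SciPy for brevity, with all figures invented:

    # Hypothetical illustration (not the project's Java code): verify trueness against a
    # certified reference value and estimate repeatability precision.
    import numpy as np
    from scipy import stats

    replicates = np.array([5.02, 4.98, 5.05, 4.97, 5.01, 5.03])  # example measurements
    reference_value = 5.00                                        # assumed certified value

    # Trueness: is the mean significantly different from the reference value?
    t_stat, p_value = stats.ttest_1samp(replicates, reference_value)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}  (p >= 0.05 -> no evidence of bias)")

    # Precision: repeatability expressed as %RSD
    rsd = 100 * replicates.std(ddof=1) / replicates.mean()
    print(f"repeatability RSD = {rsd:.2f} %")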
Abstract:
We derive necessary and sufficient conditions under which a set of variables is informationally sufficient, i.e. it contains enough information to estimate the structural shocks with a VAR model. Based on such conditions, we suggest a procedure to test for informational sufficiency. Moreover, we show how to amend the VAR if informational sufficiency is rejected. We apply our procedure to a VAR including TFP, unemployment and per-capita hours worked. We find that the three variables are not informationally sufficient. When adding missing information, the effects of technology shocks change dramatically.
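The abstract does not spell out the testing procedure; one common way to operationalize a check of this kind (an assumption for illustration, not necessarily the authors' exact method) is to extract factors from a larger information set and test whether they Granger-cause the VAR variables, rejection indicating that the small VAR misses relevant information:

    # Hypothetical sketch (not necessarily the authors' procedure): test whether principal
    # components of a larger panel Granger-cause the three VAR variables.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(0)
    T = 200
    small = pd.DataFrame(rng.normal(size=(T, 3)), columns=["tfp", "unemp", "hours"])  # placeholder data
    big_panel = rng.normal(size=(T, 50))                                              # placeholder large dataset

    # Extract a few principal components ("factors") from the large panel
    centered = big_panel - big_panel.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    factors = pd.DataFrame(centered @ vt[:2].T, columns=["f1", "f2"])

    # Fit a VAR on [variables, factors] and test whether the factors Granger-cause the variables
    model = VAR(pd.concat([small, factors], axis=1)).fit(4)
    test = model.test_causality(caused=["tfp", "unemp", "hours"], causing=["f1", "f2"], kind="f")
    print(test.summary())  # rejection suggests the three variables are not informationally sufficient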
Abstract:
The aim of the project is to develop a working tool for a Quality department. Through it, it must be possible to run automated tests on specific features of the Logic Class application: payroll calculation and Social Security contributions.
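The abstract gives no implementation details; as a minimal sketch of what an automated regression test of a payroll calculation might look like (the function calc_net_salary, its rates and expected figures are placeholders, not part of Logic Class):

    # Hypothetical example: an automated regression test for a payroll calculation.
    # calc_net_salary and the figures below are placeholders, not Logic Class APIs.
    import pytest

    def calc_net_salary(gross: float, ss_rate: float, irpf_rate: float) -> float:
        """Toy payroll rule: net = gross - social security - income tax withholding."""
        return round(gross - gross * ss_rate - gross * irpf_rate, 2)

    @pytest.mark.parametrize("gross, ss_rate, irpf_rate, expected", [
        (2000.0, 0.0635, 0.15, 1573.00),
        (1500.0, 0.0635, 0.12, 1224.75),
    ])
    def test_net_salary(gross, ss_rate, irpf_rate, expected):
        assert calc_net_salary(gross, ss_rate, irpf_rate) == pytest.approx(expected)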
Abstract:
En el projecte s’ha dut a terme un estudi sobre la tecnologia que aporten les targetes gràfiques (GPU) dins l’àmbit de programació d’aplicacions que tradicionalment eren executades en la CPU o altrament conegut com a GPGPU. S’ha fet una anàlisi profunda del marc tecnològic actual explicant part del maquinari de les targetes gràfiques i de què tracta el GPGPU. També s’han estudiat les diferents opcions que existeixen per poder realitzar els tests de rendiment que permetran avaluar el programari, quin programari està dissenyat per ser executat amb aquesta tecnologia i quin és el procediment a seguir per poder utilitzar-los. S’han efectuat diverses proves per avaluar el rendiment de programari dissenyat o compatible d’executar en la GPU, realitzant taules comparatives amb els temps de còmput. Un cop finalitzades les diferents proves del programari, es pot concloure que no tota aplicació processada en la GPU aporta un benefici. Per poder veure millores és necessari que l’aplicació reuneixi una sèrie de requisits com que disposi d’un elevat nombre d’operacions que es puguin realitzar en paral lel, que no existeixin condicionants per a l’execució de les operacions i que sigui un procés amb càlcul aritmètic intensiu.
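As a hedged illustration of the kind of CPU-versus-GPU timing comparison described above (CuPy is assumed available here; it is not necessarily the software evaluated in the project, and the matrix size is arbitrary):

    # Hypothetical benchmark sketch: compare a large matrix multiplication on CPU vs GPU.
    import time
    import numpy as np
    import cupy as cp

    n = 4096
    a_cpu = np.random.rand(n, n).astype(np.float32)
    b_cpu = np.random.rand(n, n).astype(np.float32)

    t0 = time.perf_counter()
    np.matmul(a_cpu, b_cpu)
    cpu_time = time.perf_counter() - t0

    a_gpu, b_gpu = cp.asarray(a_cpu), cp.asarray(b_cpu)
    t0 = time.perf_counter()
    cp.matmul(a_gpu, b_gpu)
    cp.cuda.Stream.null.synchronize()   # wait for the asynchronous GPU kernel to finish
    gpu_time = time.perf_counter() - t0

    print(f"CPU: {cpu_time:.3f} s   GPU: {gpu_time:.3f} s")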
Abstract:
This article discusses the lessons learned from developing and delivering the Vocational Management Training for the European Tourism Industry (VocMat) online training programme, which was aimed at providing flexible, online distance learning for the European tourism industry. The programme was designed to address managers' need for flexible, senior-management-level training which they could access at a time and place that fitted in with their work and non-work commitments. The authors present two main approaches to using the Virtual Learning Environment, the feedback from the participants, and the implications of online technology for extending tourism training opportunities.
Abstract:
In standard multivariate statistical analysis, common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation, and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover, we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations, and also hypotheses of subcompositional stability. We identify the two problems as being the counterpart of the analysis of paired-comparison or split-plot experiments and of separate-sample comparative experiments in the jargon of standard multivariate analysis. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
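For readers unfamiliar with the simplicial operations referred to above, the following is a minimal sketch of perturbation and subcomposition using their standard definitions from compositional data analysis (illustrative only, not code from the paper):

    # Standard compositional operations: closure, perturbation, subcomposition.
    import numpy as np

    def closure(x):
        """Rescale a positive vector so its parts sum to 1."""
        x = np.asarray(x, dtype=float)
        return x / x.sum()

    def perturb(x, p):
        """Perturbation, the simplicial analogue of translation: C(x1*p1, ..., xD*pD)."""
        return closure(np.asarray(x) * np.asarray(p))

    def subcomposition(x, idx):
        """Subcomposition: select parts and re-close (replaces the subvector concept)."""
        return closure(np.asarray(x)[list(idx)])

    x = closure([0.2, 0.3, 0.5])
    print(perturb(x, [2.0, 1.0, 0.5]))      # composition after a perturbation
    print(subcomposition(x, [0, 2]))        # two-part subcomposition of parts 1 and 3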
Abstract:
This doctoral research reflects on the contribution of contemporary artistic practices to processes of dialogue between cultures in the Mediterranean. In the dynamics of perceiving, approaching and (re)cognising the other and other cultures, all information and communication media can, on the one hand, be valid and effective while, on the other, they can stray from the truth and easily spread numerous prejudices. Against the "synthesis" performed by traditional media, in which "the Other" is often the object of stereotyped representations, artistic practices, in their broadest sense, set an "analysis" that can, in various cases, remove barriers and bring us closer to the other culture. This research traces a theoretical itinerary that starts from the Orientalist hypotheses; passes through the philosophies of the encounter with the other, studies on participation, current reflections on relational aesthetics, and the historical and socio-political definitions of the Mediterranean; and finally recognises in art, in artistic creation, the most effective and penetrating means of communication for refining knowledge and (re)discovering the different facets of the cultures of the countries and peoples of the Mediterranean. In the plural space of the Mediterranean, today's artistic and creative practices act as a field of experimentation and a valuable territory of debate for the development of intercultural discourse.
Abstract:
Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of one of the biggest financial markets, the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at one-minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following or mean reversion. The former is grouped into the so-called technical models and the latter into so-called pairs trading. Technical models have been dismissed by financial theoreticians, but we show that they can be properly cast as a well-defined scientific predictor if the signal they generate passes the test of being a Markov time; that is, we can tell whether the signal has occurred or not by examining the information up to the current time, or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple, yet it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a co-integration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting process and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigour and yet lack any economic value, making it useless from a practical point of view. This is why this project could not be complete without a backtest of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. For this reason we emphasize the calibration of the strategies' parameters to adapt to the prevailing market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
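The thesis code is in MATLAB and is not reproduced here; as a hedged illustration of the pairs-trading idea described above (a mean-reverting Ornstein-Uhlenbeck-style spread traded on rolling z-score thresholds, with all parameters and data synthetic placeholders):

    # Illustrative sketch only (the thesis used MATLAB): a mean-reversion signal on a
    # synthetic Ornstein-Uhlenbeck spread, traded on z-score thresholds.
    import numpy as np

    rng = np.random.default_rng(1)
    theta, mu, sigma, dt, n = 2.0, 0.0, 0.3, 1 / 252, 2000

    # Simulate an OU spread: dX = theta*(mu - X)*dt + sigma*dW
    spread = np.empty(n)
    spread[0] = 0.0
    for t in range(1, n):
        spread[t] = spread[t - 1] + theta * (mu - spread[t - 1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

    # Entry/exit on rolling z-score: short the spread when rich, long when cheap
    window, entry = 60, 1.5
    position = np.zeros(n)
    for t in range(window, n):
        hist = spread[t - window:t]
        z = (spread[t] - hist.mean()) / hist.std(ddof=1)
        position[t] = -1.0 if z > entry else (1.0 if z < -entry else 0.0)

    pnl = np.sum(position[:-1] * np.diff(spread))   # mark-to-market P&L in spread units
    print(f"toy backtest P&L: {pnl:.3f}")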
Abstract:
In the eighties, John Aitchison (1986) developed a new methodological approach for the statistical analysis of compositional data. This new methodology was implemented in BASIC routines grouped under the name CODA and later as NEWCODA in Matlab (Aitchison, 1997). Since then, several other authors have published extensions to this methodology: Martín-Fernández and others (2000), Barceló-Vidal and others (2001), Pawlowsky-Glahn and Egozcue (2001, 2002) and Egozcue and others (2003). (...)
Abstract:
Despite North Korea's strategic relevance and destabilising role in the world's most economically dynamic region, the EU has no clear strategy for engaging with this country. Combining qualitative and quantitative analysis techniques, this study aims to uncover possible internal contradictions that prevent the definition of a coherent and effective European foreign policy towards North Korea, as well as discrepancies between the perceptions of actors inside the EU and those of external actors. Significant gaps in expectations and shortcomings in terms of coherence were detected, both among the views expressed by internal actors and between the opinions of these actors and those of the future South Korean leaders surveyed, differences that even affect the promotion of human rights.
Abstract:
The report presented in this book is the result of a new collaboration agreement between the United Nations Human Settlements Programme (UN-Habitat) and the Institut de Seguretat Pública de Catalunya, launched with the aim of improving safety at public events in urban spaces in Africa. The pilot phase was carried out in 2010, during the two training seminars held in Mollet del Vallès (Barcelona) as part of the Police Platform for Urban Development (PPUD). This report describes the origins and current state of the initiative and summarises its results. It also includes some recommendations for improving the safety of public events. Source: http://www.onuhabitat.org.
Abstract:
Rat superior cervical ganglion (SCG) neurons express low-threshold noninactivating M-type potassium channels (IK(M)), which can be inhibited by activation of M1 muscarinic receptors (M1 mAChR) and bradykinin (BK) B2 receptors. Inhibition by the M1 mAChR agonist oxotremorine methiodide (Oxo-M) is mediated, at least in part, by the pertussis toxin-insensitive G-protein Gαq (Caulfield et al., 1994; Haley et al., 1998a), whereas BK inhibition involves Gαq and/or Gα11 (Jones et al., 1995). Gαq and Gα11 can stimulate phospholipase C-β (PLC-β), raising the possibility that PLC is involved in IK(M) inhibition by Oxo-M and BK. RT-PCR and antibody staining confirmed the presence of PLC-β1, -β2, -β3, and -β4 in rat SCG. We have tested the role of two PLC isoforms (PLC-β1 and PLC-β4) using antisense-expression constructs. Antisense constructs, consisting of the cytomegalovirus promoter driving antisense cRNA corresponding to the 3'-untranslated regions of PLC-β1 and PLC-β4, were injected into the nucleus of dissociated SCG neurons. Injected cells showed reduced antibody staining for the relevant PLC-β isoform when compared with uninjected cells 48 hr later. BK inhibition of IK(M) was significantly reduced 48 hr after injection of the PLC-β4, but not the PLC-β1, antisense-encoding plasmid. Neither PLC-β antisense altered M1 mAChR inhibition by Oxo-M. These data support the conclusion of Cruzblanca et al. (1998) that BK, but not M1 mAChR, inhibition of IK(M) involves PLC, and extend this finding by indicating that PLC-β4 is involved.
Abstract:
Background: Recent advances in high-throughput technologies have produced a vast number of protein sequences, while the number of high-resolution structures has seen only a limited increase. This has driven the development of many strategies to build protein structures from sequence, generating a considerable number of alternative models. The selection of the model closest to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, by knowledge-based potentials, or by a combination of both. Results: Here, we present and demonstrate a theory for splitting knowledge-based potentials into biologically meaningful scoring terms and for combining them into new scores to predict near-native structures. Our strategy circumvents the problem of defining the reference state. In this approach we give the proof for a simple, linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score, we obtained predictions similar to those of state-of-the-art methods. In addition, our approach has the advantage of identifying the terms most relevant to the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors. Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores detect near-native structures as accurately as state-of-the-art methods and have been successful in identifying wrongly modelled regions of many near-native conformations.
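The abstract does not list the actual scoring terms; as a minimal sketch of the general idea it describes (standardizing several energy terms into Z-scores across a set of candidate models and summing them into a composite score, with term names, data and the sign convention as placeholders):

    # Illustrative sketch only (term names and data are placeholders, not the paper's
    # potentials): combine per-term energies into a composite Z-score and rank models.
    import numpy as np

    rng = np.random.default_rng(2)
    n_models = 100
    # Rows = candidate models, columns = knowledge-based scoring terms (placeholders)
    terms = {"pair": rng.normal(size=n_models),
             "solvation": rng.normal(size=n_models),
             "torsion": rng.normal(size=n_models)}
    energies = np.column_stack(list(terms.values()))

    # Z-score each term across the set of models, then sum into a composite score
    zscores = (energies - energies.mean(axis=0)) / energies.std(axis=0, ddof=1)
    composite = zscores.sum(axis=1)

    best = np.argsort(composite)[:5]   # lower (more negative) = more favourable in this toy convention
    print("top-ranked candidate models:", best)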