20 results for method development
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This paper reports the development of a method for the simultaneous determination of methylmercury (MeHg+) and inorganic mercury (iHg) species in seafood samples. The study focused on the extraction and quantification of MeHg+ (the most toxic species) by liquid chromatography coupled to on-line UV irradiation and cold vapour atomic fluorescence spectroscopy (LC-UV-CV-AFS), using 4 mol/L HCl as the extractant. The accuracy of the method was verified by analysing three certified reference materials (CRMs) and different spiked samples. The values found for total Hg and MeHg+ in the CRMs did not differ significantly from the certified values at the 95% confidence level, and spike recoveries between 85% and 97% were achieved for MeHg+. The detection limits (LODs) obtained were 0.001 mg Hg/kg for total mercury, 0.0003 mg Hg/kg for MeHg+ and 0.0004 mg Hg/kg for iHg. The quantification limits (LOQs) established were 0.003 mg Hg/kg for total mercury, 0.0010 mg Hg/kg for MeHg+ and 0.0012 mg Hg/kg for iHg. The precision established for each mercury species was 12% RSD in all cases. Finally, the developed method was applied to 24 seafood samples of different origins and total mercury contents. The concentrations of total Hg, MeHg+ and iHg ranged from 0.07 to 2.33, 0.003 to 2.23 and 0.006 to 0.085 mg Hg/kg, respectively. The established analytical method yields mercury speciation results in less than one hour, including both the sample pretreatment and the measurement step.
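The abstract reports LODs and LOQs without stating how they were derived. A common convention (an assumption here, not necessarily this paper's procedure) is the 3σ/10σ criterion applied to replicate blank measurements and the calibration slope; a minimal sketch, with invented data:

```python
# Illustrative LOD/LOQ calculation using the common 3*sigma / 10*sigma
# convention.  The blank signals and calibration slope are invented and
# only show the arithmetic, not the paper's actual figures of merit.
import statistics

def lod_loq(blank_signals, slope):
    """Return (LOD, LOQ) in concentration units from replicate blank
    measurements and the calibration-curve slope (signal per unit conc.)."""
    s_blank = statistics.stdev(blank_signals)
    return 3 * s_blank / slope, 10 * s_blank / slope

blanks = [0.8, 1.1, 0.9, 1.0, 1.2, 0.9, 1.1]  # hypothetical blank signals
slope = 1500.0                                 # hypothetical signal per (mg Hg/kg)
lod, loq = lod_loq(blanks, slope)
```

By construction the LOQ is 10/3 times the LOD under this convention, which matches the roughly threefold LOD-to-LOQ ratios quoted in the abstract.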
Abstract:
Report for the scientific sojourn carried out at the International Center for Numerical Methods in Engineering (CIMNE), a state agency, from February until November 2007. The work within the project "Technology innovation in underground construction" can be grouped into the following tasks: development of software for modelling underground excavation based on the discrete element method (the numerical algorithms have been implemented in computer programs and applied to the simulation of excavation using roadheaders and TBMs); coupling of the discrete element method with the finite element method; and development of a numerical model of rock cutting that takes into account the wear of rock cutting tools, a very important factor influencing the effectiveness of underground works.
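The discrete element method mentioned above resolves interactions between individual particles through explicit contact laws. As a minimal sketch only (a 1D linear spring-dashpot contact between two particles, with all parameters invented; real DEM excavation codes use 3D geometry and far richer contact models):

```python
# Minimal 1D discrete-element sketch: two particles approach each other and
# interact through a linear spring-dashpot contact law while they overlap.
# Stiffness, damping, masses and radii are invented for illustration.

def simulate(steps=20000, dt=1e-5):
    m = 1.0                      # particle mass (kg)
    r = 0.05                     # particle radius (m)
    k = 1e5                      # contact stiffness (N/m)
    c = 10.0                     # contact damping (N s/m)
    x1, v1 = 0.0, 1.0            # left particle, moving right
    x2, v2 = 0.2, -1.0           # right particle, moving left
    for _ in range(steps):
        overlap = 2 * r - (x2 - x1)
        if overlap > 0:          # particles in contact
            # repulsive, dissipative contact force (clamped at zero so the
            # contact never pulls the particles together)
            f = max(0.0, k * overlap + c * (v1 - v2))
        else:
            f = 0.0
        v1 -= f / m * dt         # explicit (symplectic Euler) update
        v2 += f / m * dt
        x1 += v1 * dt
        x2 += v2 * dt
    return v1, v2

v1, v2 = simulate()
```

After the collision the particles rebound with reduced speed, the damping term having dissipated part of the kinetic energy.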
Abstract:
To develop a pharmaceutical product it is necessary to establish an analytical method that allows the determination and quantification of all the substances it contains, whether active ingredients, impurities and degradation products, preservatives or antioxidants. Major bodies such as the ICH stress the importance of validating analytical methods, since validation is the way to demonstrate that a product meets previously established quality guarantees. The objective of this final degree project is therefore to develop and validate two analytical methods, by liquid chromatography (HPLC), for the determination of amino acids and carbohydrates, respectively, in a pharmaceutical product. To conclude that a method is suitable for the determination for which it was developed, the results obtained must meet the acceptance criteria for the parameters evaluated in an analytical validation. These parameters are precision, selectivity, accuracy, linearity and range. The results of this project show that the two methods developed are suitable for the determination of three of the active ingredients (amino acid 1, amino acid 2 and carbohydrate 1) contained in the veterinary pharmaceutical product analysed, and that they can be validated, since they meet the acceptance criteria for the parameters proposed by the ICH. The method for the determination of carbohydrates is not valid for carbohydrate 2, since during development it was found that a considerable part of it converted into carbohydrate 1 (a shift of the keto-enol equilibrium between carbohydrate 1 and carbohydrate 2 at high pH). For this reason, it can be concluded that this method is not valid, and further research is recommended in order to develop a suitable analytical method.
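Two of the validation parameters listed above, precision and linearity, reduce to simple calculations on replicate and calibration data. A sketch with invented data and invented acceptance thresholds (the project's actual criteria follow the ICH guidelines and are not given in the abstract):

```python
# Illustrative calculation of two common validation parameters: precision
# (as %RSD of replicate injections) and linearity (correlation coefficient
# of the calibration line).  Data and acceptance thresholds are invented.
import statistics

def rsd_percent(values):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def pearson_r(x, y):
    """Pearson correlation coefficient of a calibration series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

areas = [1002.1, 998.4, 1005.0, 996.7, 1001.3, 999.9]  # replicate peak areas
conc  = [10, 20, 40, 60, 80, 100]                      # calibration levels
resp  = [101.2, 203.5, 399.8, 602.1, 801.0, 1003.4]    # detector responses

precision_ok = rsd_percent(areas) <= 2.0     # e.g. acceptance: RSD <= 2%
linearity_ok = pearson_r(conc, resp) >= 0.999
```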
Abstract:
The studies of Giacomo Becattini concerning the notion of the "Marshallian industrial district" have led to a revolution in the field of economic development around the world. The paper offers an interpretation of the methodology adopted by Becattini, whose roots are clearly Marshallian: Becattini proposes a return to economics as a complex social science that operates in historical time. We adopt a Schumpeterian approach to method in economic analysis in order to highlight the similarities between Marshall's and Becattini's approaches. Finally, the paper uses the distinction between logical time, real time and historical time, which enables us to study the "localized" economic process in a Becattinian way.
Abstract:
The work in this paper deals with the development of momentum and thermal boundary layers when a power law fluid flows over a flat plate. At the plate we impose either constant temperature, constant flux or a Newton cooling condition. The problem is analysed using similarity solutions, integral momentum and energy equations and an approximation technique which is a form of the Heat Balance Integral Method. The fluid properties are assumed to be independent of temperature, hence the momentum equation uncouples from the thermal problem. We first derive the similarity equations for the velocity and present exact solutions for the case where the power law index n = 2. The similarity solutions are used to validate the new approximation method. This new technique is then applied to the thermal boundary layer, where a similarity solution can only be obtained for the case n = 1.
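The similarity reduction described above can be illustrated on the classical Newtonian special case n = 1, where the velocity problem reduces to the Blasius equation f''' + ½ f f'' = 0 with f(0) = f'(0) = 0 and f'(∞) = 1. The sketch below solves that special case by shooting; it illustrates the general approach only, not the authors' power-law or Heat Balance Integral solutions:

```python
# Shooting solution of the Blasius boundary-layer equation
# f''' + 0.5*f*f'' = 0,  f(0) = f'(0) = 0,  f'(inf) = 1,
# the Newtonian (n = 1) special case of the power-law flat-plate problem.
# Pure-Python RK4 integration plus bisection on the wall shear f''(0).

def integrate(s, eta_max=10.0, h=0.01):
    """Integrate the ODE with f''(0) = s and return f'(eta_max)."""
    def deriv(y):
        f, fp, fpp = y
        return (fp, fpp, -0.5 * f * fpp)
    y = (0.0, 0.0, s)
    for _ in range(int(eta_max / h)):
        k1 = deriv(y)
        k2 = deriv(tuple(y[i] + 0.5 * h * k1[i] for i in range(3)))
        k3 = deriv(tuple(y[i] + 0.5 * h * k2[i] for i in range(3)))
        k4 = deriv(tuple(y[i] + h * k3[i] for i in range(3)))
        y = tuple(y[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                  for i in range(3))
    return y[1]

# Bisect on the unknown wall shear f''(0) until f'(inf) -> 1.
lo, hi = 0.1, 1.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if integrate(mid) < 1.0:
        lo = mid
    else:
        hi = mid
wall_shear = 0.5 * (lo + hi)   # classical value is about 0.332
```

The recovered wall shear f''(0) ≈ 0.332 is the standard benchmark against which approximate methods such as the Heat Balance Integral can be validated.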
Abstract:
The blue swimmer crab is a commercially important species of the tropical Indo-Pacific regions that shows substantial potential as a candidate species for aquaculture. Optimization of larval rearing conditions, including photoperiod, is therefore important to establish a method for the intensive hatchery culture of this species. Newly hatched larvae of Portunus pelagicus in the first zoeal stage (ZI) were reared under five photoperiod regimes, 0L:24D, 6L:18D, 12L:12D, 18L:6D and 24L:0D (5 replicates per treatment), until they metamorphosed to megalopae (which took from 8.5 ± 0.3 days (18L:6D) to 10.8 ± 1.8 days (0L:24D) at 29 ± 1 °C). Larvae of each treatment were fed daily an identical diet of mixed rotifers and Artemia nauplii, and survival and molting to successive stages were monitored. Newly hatched ZI larvae of P. pelagicus could successfully develop to the megalopal stage under all tested photoperiod conditions, but we detected significant differences in survival among treatments (p < 0.05). The constant-darkness treatment (0L:24D) had the lowest cumulative survival from ZI to the megalopal stage (19.2 ± 7.2%, mean ± S.E.), while the 18L:6D treatment achieved the highest survival (51.2 ± 23.6%). Similarly, photoperiod significantly affected zoeal development. Constant darkness led to the longest cumulative zoeal duration (10.8 ± 1.8 days), whereas the 18L:6D treatment yielded the shortest larval development (8.5 ± 0.3 days). In addition, larvae reared under constant darkness produced the smallest megalopae (carapace length = 1.44 ± 0.09 mm) with the lowest dry weight (0.536 ± 0.188 mg). In conclusion, photoperiod significantly affected the survival, development, and growth of P. pelagicus zoeal larvae. Constant darkness led to the lowest larval survival and developmental rate, while a photoperiod regime of 18L:6D appeared to be the most suitable condition for rearing the zoeal larvae of P. pelagicus.
Abstract:
This paper presents the design and implementation of QRP, an open-source proof-of-concept authentication system that implements two-factor authentication by combining a password with a camera-equipped mobile phone acting as an authentication token. QRP is extremely secure, as all the sensitive information stored and transmitted is encrypted, but it is also an easy-to-use and cost-efficient solution. QRP is portable and can be used securely on untrusted computers. Finally, QRP is able to authenticate successfully even when the phone is offline.
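The abstract does not detail QRP's protocol; the core of a generic phone-as-token scheme of this kind can be sketched as a challenge-response over a pre-shared secret. All design choices below (nonce-in-QR delivery, HMAC-SHA256, a provisioned key) are assumptions for illustration, not QRP's actual design:

```python
# Schematic challenge-response core of a phone-as-token scheme: the server
# issues a random nonce (in a QRP-like system, delivered as a QR code the
# phone scans), and the phone answers with an HMAC over it computed with a
# pre-shared secret.  Illustrative only; not QRP's actual protocol.
import hashlib
import hmac
import secrets

SHARED_KEY = b"provisioned-during-enrolment"   # hypothetical pre-shared secret

def server_challenge():
    """Fresh random nonce, e.g. encoded into a QR code."""
    return secrets.token_bytes(16)

def phone_response(key, nonce):
    """Phone-side proof of key possession over the scanned nonce."""
    return hmac.new(key, nonce, hashlib.sha256).hexdigest()

def server_verify(key, nonce, response):
    """Constant-time comparison against the expected response."""
    expected = hmac.new(key, nonce, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

nonce = server_challenge()
ok = server_verify(SHARED_KEY, nonce, phone_response(SHARED_KEY, nonce))
bad = server_verify(SHARED_KEY, nonce, phone_response(b"wrong-key", nonce))
```

Because the phone only needs the secret and the scanned nonce to compute its response, a scheme of this shape can also work with the phone offline, consistent with the property claimed in the abstract.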
Abstract:
Descriptors based on Molecular Interaction Fields (MIF) are highly suitable for drug discovery, but their size (thousands of variables) often limits their application in practice. Here we describe a simple and fast computational method that extracts from a MIF a handful of highly informative points (hot spots) which summarize the most relevant information. The method was developed specifically for drug discovery; it is fast and requires no human supervision, making it suitable for application to very large series of compounds. The quality of the results was tested by running the method on the ligand structures of a large number of ligand-receptor complexes and then comparing the positions of the selected hot spots with the actual atoms of the receptor. As an additional test, the hot spots obtained with the novel method were used to compute GRIND-like molecular descriptors, which were compared with the original GRIND. In both cases the results show that the novel method is highly suitable for describing ligand-receptor interactions and compares favorably with other state-of-the-art methods.
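The abstract does not specify the extraction algorithm; one simple way to reduce a MIF to a handful of points, shown purely as an illustration (not the authors' method), is to greedily keep the most favourable grid points subject to a minimum mutual distance:

```python
# Illustrative hot-spot extraction (NOT the authors' algorithm): from a
# list of MIF grid points with interaction energies, greedily keep the most
# favourable (lowest-energy) points that lie at least `min_dist` apart, so
# a cluster of nearby minima contributes a single representative point.
import math

def hot_spots(points, k=3, min_dist=1.5):
    """points: list of (x, y, z, energy); returns up to k selected points."""
    selected = []
    for p in sorted(points, key=lambda p: p[3]):      # most negative first
        if all(math.dist(p[:3], q[:3]) >= min_dist for q in selected):
            selected.append(p)
        if len(selected) == k:
            break
    return selected

grid = [(0.0, 0.0, 0.0, -5.2), (0.5, 0.0, 0.0, -5.0),   # clustered minimum
        (3.0, 0.0, 0.0, -4.1), (0.0, 4.0, 0.0, -3.3),
        (1.0, 1.0, 5.0, -0.2)]
spots = hot_spots(grid, k=3)
```

Here the second grid point, 0.5 units from the deepest minimum, is absorbed into it, and the three surviving hot spots summarize the field with three points instead of five.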
Abstract:
The Person Trade-Off (PTO) is a methodology aimed at measuring the social value of health states; other methodologies measure individual utility and are less appropriate for resource allocation decisions. However, few studies have been conducted to test the validity of the method. We present a pilot study with this objective, based on interviews with 30 undergraduate students in Economics. We judge the validity of PTO answers by their adequacy to three rationality hypotheses. First, we show that, given certain rationality assumptions, PTO answers should be predictable from answers to Standard Gamble questions. This first hypothesis is not verified. The second hypothesis is that PTO answers should not vary across different framings of equivalent PTO questions. This second hypothesis is also not verified. Our third hypothesis is that PTO values should predict social preferences for allocating resources between patients. This hypothesis is verified. The evidence on the validity of the method is therefore conflicting.
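The first hypothesis rests on a simple arithmetic link. Under a QALY-style reading of the rationality assumptions (our illustration, not necessarily the authors' exact model), a Standard Gamble utility u for a health state implies that curing one patient gains 1 − u, so curing N patients is socially equivalent to saving M full lives when N(1 − u) = M:

```python
# Arithmetic behind the first rationality hypothesis, under an assumed
# QALY-style model: if a health state has Standard Gamble utility u, curing
# one patient gains (1 - u), so the predicted PTO answer N (cures judged
# equivalent to M lives saved) satisfies N * (1 - u) = M.

def predicted_pto(u, lives_saved):
    """Number of cures predicted to be equivalent to `lives_saved` lives."""
    return lives_saved / (1.0 - u)

# Hypothetical example: SG utility 0.8 means each cure gains 0.2, so saving
# 10 lives should be matched by curing 50 patients.
n = predicted_pto(0.8, 10)
```

The study's finding is that respondents' actual PTO answers deviate from this prediction, which is why the first hypothesis fails.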
Abstract:
We continue the development of a method for the selection of a bandwidth or a number of design parameters in density estimation. We provide explicit non-asymptotic density-free inequalities that relate the $L_1$ error of the selected estimate with that of the best possible estimate, and study in particular the connection between the richness of the class of density estimates and the performance bound. For example, our method allows one to pick the bandwidth and kernel order in the kernel estimate simultaneously and still assure that, for {\it all densities}, the $L_1$ error of the corresponding kernel estimate is not larger than about three times the error of the estimate with the optimal smoothing factor and kernel, plus a constant times $\sqrt{\log n/n}$, where $n$ is the sample size and the constant depends only on the complexity of the family of kernels used in the estimate. Further applications include multivariate kernel estimates, transformed kernel estimates, and variable kernel estimates.
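The general shape of data-driven bandwidth selection can be sketched crudely: split the sample, build a kernel estimate on one half for each candidate bandwidth, and keep the bandwidth whose estimate best matches the other half in an $L_1$ sense. This toy stand-in (comparing against a histogram of the held-out half) is NOT the paper's minimum-distance selection method and carries none of its guarantees; it only illustrates the selection idea:

```python
# Crude illustrative bandwidth selection by sample splitting (not the
# paper's method): for each candidate bandwidth, measure an approximate L1
# distance between the kernel estimate built on one half of the sample and
# a histogram of the other half, and keep the minimizer.
import math
import random

def kde(data, h, x):
    """Gaussian kernel density estimate at point x with bandwidth h."""
    c = 1.0 / (len(data) * h * math.sqrt(2 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - d) / h) ** 2) for d in data)

def l1_to_hist(train, test, h, lo=-4.0, hi=4.0, bins=40):
    """Approximate L1 distance between kde(train, h) and a test histogram."""
    w = (hi - lo) / bins
    total = 0.0
    for i in range(bins):
        a = lo + i * w
        hist = sum(1 for t in test if a <= t < a + w) / (len(test) * w)
        total += abs(kde(train, h, a + w / 2) - hist) * w
    return total

random.seed(0)
sample = [random.gauss(0, 1) for _ in range(400)]
train, test = sample[::2], sample[1::2]
bandwidths = [0.05, 0.1, 0.2, 0.4, 0.8, 1.6]
best_h = min(bandwidths, key=lambda h: l1_to_hist(train, test, h))
```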
Abstract:
Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in the quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, no single method resolves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields needed to calculate the three-dimensional structure of the atmospheric refractive index, from which a physically based prediction of the incidence of clutter can be made. This technique can be used in conjunction with existing methods for clutter removal by modifying the parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations for beam propagation in a non-uniformly stratified atmosphere; but, although intrinsically very efficient, it is not sufficiently fast to be practicable for near-real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for improving the model profiles in the lowest levels of the troposphere.
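The refractive-index field driving anaprop is conventionally summarized by the radio refractivity N and the modified refractivity M, computed from the NWP pressure, temperature and humidity fields via the standard approximations N = 77.6 P/T + 3.73×10⁵ e/T² and M = N + 0.157 h. A layer where M decreases with height indicates a duct. The two-level profile below is invented for illustration:

```python
# Radio refractivity N and modified refractivity M from pressure P (hPa),
# temperature T (K), water-vapour pressure e (hPa) and height h (m), using
# the standard approximations N = 77.6*P/T + 3.73e5*e/T**2 and
# M = N + 0.157*h.  The sample profile is invented; a layer where M
# decreases with height (dM/dh < 0) indicates a duct favouring anaprop.

def refractivity(P, T, e):
    return 77.6 * P / T + 3.73e5 * e / T ** 2

def modified_refractivity(P, T, e, h):
    return refractivity(P, T, e) + 0.157 * h

# Invented two-level profile with a warm, dry layer aloft (an inversion
# with a sharp moisture decrease -- classic ducting conditions):
levels = [  # (height m, P hPa, T K, e hPa)
    (0,   1013.0, 290.0, 14.0),
    (100, 1001.0, 291.5,  4.0),
]
M = [modified_refractivity(P, T, e, h) for h, P, T, e in levels]
ducting = M[1] < M[0]   # negative vertical M gradient -> duct detected
```

A screening of this kind over the full 3D NWP fields is what identifies the regions where the (more expensive) parabolic equation beam-propagation calculation predicts anaprop clutter.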
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance, given the potential hazards involved, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, in which different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground (CG) lightning data and intra-cloud (IC) lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. By contrast, the duration of the maturity phase is much more variable and related to thunderstorm intensity, defined here in terms of lightning flash rate. Most of the IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase a few more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life-cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to the thunderstorm's total duration and the maximum value of the variables considered.
Among other findings, the study indicates that the normalized duration of the three stages of thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
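The normalization used for the life-cycle comparison, rescaling each parameter time series with respect to the storm's total duration and the variable's maximum value, can be sketched as follows (the flash-rate series is invented):

```python
# Normalize a thunderstorm parameter time series so that time runs over
# [0, 1] of the storm's total duration and the variable runs over [0, 1]
# of its maximum value, allowing storms of different durations and
# intensities to be compared on a common life-cycle axis.

def normalize(times, values):
    """Return (relative times, relative values), both scaled to [0, 1]."""
    t0, t1 = times[0], times[-1]
    vmax = max(values)
    return ([(t - t0) / (t1 - t0) for t in times],
            [v / vmax for v in values])

minutes = [0, 10, 20, 30, 40, 50, 60]          # hypothetical storm lifetime
flash_rate = [2, 8, 20, 35, 30, 12, 3]         # hypothetical flashes/min
t_norm, v_norm = normalize(minutes, flash_rate)
```

On the normalized axes, the peak of every storm's flash rate sits at value 1, so the characteristic bell-shaped life cycle can be averaged across storms.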
Abstract:
A novel and simple procedure for concentrating adenoviruses from seawater samples is described. The technique entails adsorbing viruses to pre-flocculated skimmed-milk proteins, allowing the flocs to sediment by gravity, and dissolving the separated sediment in phosphate buffer. Concentrated virus can then be detected by PCR techniques following nucleic acid extraction. The method requires no specialized equipment other than that usually available in routine public health laboratories, and its simplicity allows a larger number of water samples to be processed simultaneously. The usefulness of the method was demonstrated by concentrating virus from multiple seawater samples during a survey of adenoviruses in coastal waters.
Abstract:
The following paper introduces a new approach to the analysis of the offensive game in football. The main aim of this study was therefore to create an instrument for collecting information for the analysis of offensive actions and game interactions. The observation instrument used to accomplish this objective consists of a combination of field formats (FC) and category systems (SC). This methodology is a particular strategy of the scientific method whose objective is to analyse perceptible behaviour occurring in habitual contexts, allowing it to be formally recorded and quantified using an ad hoc instrument, so as to obtain a systematic registration of behaviour that, once transformed into quantitative data with the necessary level of reliability and validity, allows analysis of the relations between these behaviours. The codifications undertaken to date in various football matches have shown that the instrument serves the purposes for which it was developed, enabling further research into offensive game methods in football.
Abstract:
This paper examines the statistical analysis of social reciprocity at the group, dyadic, and individual levels. Given that testing statistical hypotheses regarding social reciprocity can also be of interest, a statistical procedure based on Monte Carlo sampling has been developed and implemented in R in order to allow social researchers to describe groups and make statistical decisions.
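The authors' procedure is implemented in R; the general shape of such a Monte Carlo test can be sketched in Python as follows. Both the reciprocity statistic (sum of mutual interaction counts over dyads) and the null model (each actor's outgoing total redistributed at random) are illustrative assumptions, not the paper's actual definitions:

```python
# Schematic Monte Carlo test for social reciprocity (illustrative; the
# statistic and null model are our choices, not the authors').  The
# statistic sums min(x_ij, x_ji) over dyads; the null keeps each actor's
# outgoing total but scatters the interactions over partners at random.
import random

def reciprocity(mat):
    """Sum of mutual interaction counts over all unordered dyads."""
    n = len(mat)
    return sum(min(mat[i][j], mat[j][i]) for i in range(n) for j in range(i))

def randomized(mat, rng):
    """Null-model matrix: each actor's acts redistributed at random."""
    n = len(mat)
    out = [[0] * n for _ in range(n)]
    for i in range(n):
        for _ in range(sum(mat[i])):
            j = rng.choice([j for j in range(n) if j != i])
            out[i][j] += 1
    return out

def mc_pvalue(mat, trials=500, seed=1):
    """Upper-tail Monte Carlo p-value for the observed reciprocity."""
    rng = random.Random(seed)
    obs = reciprocity(mat)
    ge = sum(1 for _ in range(trials)
             if reciprocity(randomized(mat, rng)) >= obs)
    return (ge + 1) / (trials + 1)

group = [[0, 12, 0, 0, 0, 0],     # toy 6-actor matrix in which all
         [12, 0, 0, 0, 0, 0],     # interaction is strictly reciprocal
         [0, 0, 0, 12, 0, 0],     # within three fixed pairs
         [0, 0, 12, 0, 0, 0],
         [0, 0, 0, 0, 0, 12],
         [0, 0, 0, 0, 12, 0]]
p = mc_pvalue(group)
```

A small p-value indicates more reciprocity than expected if each actor distributed the same number of acts at random, which is the kind of statistical decision the abstract refers to.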