990 results for Down-sample algorithm
Abstract:
The understanding of false belief is one of the most important milestones in the development of social cognition in children. Many studies have been conducted on this kind of cognition in typically developing children. Despite being a key point for improving their welfare and quality of life, there are few such studies in children with Down's syndrome. The aim of the present work is to carry out an in-depth study of social cognition in children with Down's syndrome. For this purpose, we used 6 tasks, with 3 levels of difficulty, in a group of 9 children aged between 4 and 14 years. Six of these children had a genetic diagnosis of Down's syndrome. The results of our research corroborate previous studies suggesting difficulties in the development of social cognition in children with Down's syndrome, and more specifically in tasks involving false beliefs.
Abstract:
Monitoring of sewage sludge has demonstrated the presence of many polar anthropogenic pollutants since LC/MS techniques came into routine use. While advanced techniques may improve characterization, flawed sample processing procedures may disturb or disguise the presence and fate of many target compounds in this type of complex matrix before the analytical process starts. Freeze-drying or oven-drying, in combination with centrifugation or filtration, were performed as sample processing techniques, followed by visual pattern recognition of target compounds to assess the pretreatment processes. The results showed that oven-drying affected the sludge characterization, while freeze-drying led to fewer analytical misinterpretations.
Abstract:
Raman imaging spectroscopy is a highly useful analytical tool that provides spatial and spectral information on a sample. However, the CCD detectors used in dispersive instruments have the drawback of being sensitive to cosmic rays, which give rise to spikes in Raman spectra. Spikes influence variance structures and must be removed before multivariate techniques are applied. A new algorithm for correcting spikes in Raman imaging was developed using an approach based on comparison of nearest-neighbor pixels. The algorithm proved simple, fast, and selective, and removed spikes from hyperspectral images with high quality.
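A nearest-neighbor pixel comparison of this kind can be sketched as follows. This is a minimal illustration of the general idea only, not the published algorithm: the 4-connected neighborhood, the median reference, and the threshold criterion are all assumptions.

```python
import numpy as np

def remove_spikes(cube, threshold=5.0):
    """Remove cosmic-ray spikes from a Raman hyperspectral cube
    (rows x cols x wavenumbers) by comparing each pixel spectrum
    with the median of its 4-connected neighbour pixels.
    Illustrative sketch; parameters are assumptions."""
    rows, cols, nwav = cube.shape
    out = cube.copy()
    for r in range(rows):
        for c in range(cols):
            # gather the 4-connected neighbour spectra
            neigh = [cube[r + dr, c + dc]
                     for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                     if 0 <= r + dr < rows and 0 <= c + dc < cols]
            ref = np.median(neigh, axis=0)
            resid = cube[r, c] - ref
            scale = np.median(np.abs(resid)) + 1e-12   # robust spread
            spikes = resid > threshold * scale          # spikes are one-sided
            out[r, c, spikes] = ref[spikes]             # replace with reference
    return out
```

Because cosmic-ray events are extremely unlikely to hit the same spectral channel in adjacent pixels, the neighbor median gives a spike-free reference against which to test each channel.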
Abstract:
QSAR modeling is a new computer program developed to generate and validate QSAR or QSPR (quantitative structure-activity or structure-property relationship) models. With QSAR modeling, users can build partial least squares (PLS) regression models, perform variable selection with the ordered predictors selection (OPS) algorithm, and validate models using y-randomization and leave-N-out cross-validation. An additional feature is outlier detection, carried out by simultaneous comparison of sample leverage with the respective Studentized residuals. The program was developed in Java version 6 and runs on any operating system that supports Java Runtime Environment version 6. The use of the program is illustrated. The program is available for download at lqta.iqm.unicamp.br.
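The leverage/Studentized-residual diagnostic can be illustrated with a short sketch. Ordinary least squares stands in here for the PLS models the program actually builds, and the cut-off values are common rules of thumb, not the program's documented defaults.

```python
import numpy as np

def leverage_and_residuals(X, y):
    """Flag potential outliers by jointly examining sample leverage
    and Studentized residuals. OLS is used as a stand-in for PLS;
    cut-offs are common heuristics, not the program's settings."""
    n = X.shape[0]
    Xc = np.column_stack([np.ones(n), X])          # add intercept column
    H = Xc @ np.linalg.pinv(Xc.T @ Xc) @ Xc.T      # hat (projection) matrix
    h = np.diag(H)                                 # leverages h_ii
    resid = y - H @ y                              # fitted residuals
    dof = n - Xc.shape[1]
    s2 = resid @ resid / dof                       # residual variance
    t = resid / np.sqrt(s2 * (1.0 - h))            # Studentized residuals
    # flag points that are extreme in BOTH diagnostics
    flags = (h > 3.0 * Xc.shape[1] / n) & (np.abs(t) > 2.5)
    return h, t, flags
```

Plotting leverage against Studentized residuals puts influential, poorly fitted samples in the upper-right corner, which is the "simultaneous comparison" the abstract describes.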
Abstract:
A direct, extraction-free spectrophotometric method has been developed for the determination of acebutolol hydrochloride (ABH) in pharmaceutical preparations. The method is based on ion-pair complex formation between the drug and two acidic sulphonaphthalein dyes, namely bromocresol green (BCG) and bromothymol blue (BTB). Conformity to Beer's law enabled the assay of the drug in the range of 0.5-13.8 µg mL-1 with BCG and 1.8-15.9 µg mL-1 with BTB. The results obtained were of accuracy and precision equal to those of a reference method. In addition, these methods were found to be specific for the analysis of acebutolol hydrochloride in the presence of the excipients co-formulated in the drug.
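Quantitation under Beer's law amounts to a straight-line calibration within the stated linear range. A generic sketch (the calibration points below are made up for illustration, not the published BCG/BTB data):

```python
import numpy as np

def beer_lambert_calibration(conc, absorbance):
    """Fit the calibration line A = m*c + b within the Beer's-law
    linear range; returns slope, intercept and a function mapping
    a measured absorbance back to concentration. Illustrative only."""
    m, b = np.polyfit(conc, absorbance, 1)   # least-squares straight line
    def to_conc(a):
        return (a - b) / m                   # invert the calibration
    return m, b, to_conc
```

In practice the fit is accepted only if the correlation coefficient confirms linearity and the sample absorbance falls inside the calibrated range.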
Abstract:
The human genome comprises roughly 20 000 protein-coding genes. Proteins are the building material for cells and tissues, and they are functional compounds with an important role in many cellular responses, such as cell signalling. In multicellular organisms such as humans, cells need to communicate with each other in order to maintain normal function of the tissues within the body. This complex signalling between and within cells is carried out by proteins and their post-translational modifications, one of the most important being phosphorylation. The work presented here concerns the development and use of tools for phosphorylation analysis. Mass spectrometers have become essential tools for studying proteins and proteomes. In mass spectrometry-oriented proteomics, proteins can be identified and their post-translational modifications studied. The objectives of this Ph.D. thesis were to improve the robustness of sample handling methods for peptides and their phosphorylation status prior to mass spectrometry analysis. The focus was on developing strategies that enable more MS measurements per sample, higher-quality MS spectra, and simplified, rapid enrichment procedures for phosphopeptides. A further objective was to apply these methods to characterize phosphorylation sites of phosphopeptides. In these studies a new MALDI matrix was developed that gave more homogeneous, intense, and durable signals than the traditional CHCA matrix. This new matrix, along with other matrices, was subsequently used to develop a method that combines multiple spectra of identical peptides acquired with different matrices. With this approach it was possible to identify more phosphopeptides than with conventional LC/ESI-MS/MS methods, using five times less sample. In addition, a phosphopeptide-affinity MALDI target was prepared to capture and immobilise phosphopeptides from a standard peptide mixture while maintaining their spatial orientation.
Furthermore, a new protocol utilizing commercially available conductive glass slides was developed that enabled fast and sensitive phosphopeptide purification. This protocol was applied to characterize the in vivo phosphorylation of a signalling protein, NFATc1. Evidence for 12 phosphorylation sites was found, many of them in multiply phosphorylated peptides.
Abstract:
The goal of this work was to develop a simple and rapid preparation method for patulin analysis in apple juice without prior clean-up. The method combined sonication and liquid extraction and was used to determine patulin in 37 commercial apple juices available on the market in the South of Brazil. The method performance characteristics were determined using a sample obtained in a local market, fortified at five concentration levels of patulin, in triplicate. At the fortification level of 20.70 µg L-1, the coefficient of variation for repeatability was 3.53 % and the recovery 94.63 %. The correlation coefficient was 0.9996, meeting the requirement for a linear analytical method. The detection limit was 0.21 µg L-1 and the quantification limit 0.70 µg L-1. Only three of the analyzed samples were above the maximum level of 50.00 µg L-1 recommended by the World Health Organization.
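The repeatability and recovery figures reported for a fortified sample are computed from the replicate measurements; a minimal sketch (the replicate values below are illustrative, not the paper's data):

```python
import numpy as np

def recovery_and_cv(measured, spiked_level):
    """Recovery (%) and coefficient of variation (%) for replicate
    measurements of a sample fortified at a known level, the two
    figures of merit reported in method validation."""
    measured = np.asarray(measured, dtype=float)
    recovery = 100.0 * measured.mean() / spiked_level     # found vs. added
    cv = 100.0 * measured.std(ddof=1) / measured.mean()   # sample std / mean
    return recovery, cv
```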
Abstract:
In the Russian Wholesale Market, electricity and capacity are traded separately. Capacity is a special good whose sale obliges suppliers to keep their generating equipment ready to produce the quantity of electricity indicated by the System Operator. Capacity trading was established to maintain reliable and uninterrupted delivery of electricity in the wholesale market. The price of capacity reflects ongoing investments in the construction, modernization, and maintenance of power plants. The sale of capacity thus creates favorable conditions for attracting investment in the energy sector, because it guarantees investors that their investments will be recouped.
Abstract:
OBJECTIVES: 1 - Verify the prevalence of depressive symptoms in first- to fourth-year medical students using the Beck Depression Inventory (BDI). 2 - Establish correlations between target factors and higher or lower BDI scores. 3 - Investigate the relationship between the prevalence of depressive symptoms and the demand for psychological care offered by the Centro Universitário Lusíada. METHOD: Cross-sectional study of 290 first- to fourth-year medical students; administration of the BDI, a socio-demographic survey, and an evaluation of satisfaction with progress. RESULTS: The study sample was 59% female and 41% male. Mean BDI was 6.3 (SD 5.8). Overall prevalence of depressive symptoms was 23.1%. The following associations were statistically significant (p<0.05): among students for whom the course failed to meet original expectations, who were dissatisfied with the course, or who came from the interior of the State (20.5%, 12.5%, and 24.4% of the total sample, respectively), the BDI was consistent with some degree of depression in 40%, 36.1%, and 36.4% of cases, respectively. CONCLUSION: The study showed a higher prevalence of depressive symptoms in medical students than in the general population.
Abstract:
In this work a fuzzy linear system is used to solve the Leontief input-output model with fuzzy entries. For solving this model, we assume that the consumption matrix for the different sectors of the economy and the demand are known. These assumptions depend heavily on information obtained from the industries, so uncertainties are involved. The aim of this work is to model these uncertainties and address them with fuzzy entries such as fuzzy numbers and LR-type fuzzy numbers (triangular and trapezoidal). A fuzzy linear system is developed using fuzzy data and solved with the Gauss-Seidel algorithm. Numerical examples show the efficiency of this algorithm. The famous example from Prof. Leontief, in which he solved for the production levels of the U.S. economy in 1958, is also further analyzed.
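The core iteration can be sketched on a crisp Leontief system (I - C)x = d; in the fuzzy setting the same sweep would be applied to the parameters of the LR-type entries (e.g. centres and spreads of triangular numbers). A minimal sketch of the crisp iteration, not the paper's fuzzy formulation:

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=500):
    """Solve A x = b by Gauss-Seidel iteration: sweep through the
    equations, updating each unknown in place with the latest values.
    Converges for diagonally dominant A, as in (I - C)x = d when the
    consumption matrix C has small column sums."""
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.max(np.abs(x - x_old)) < tol:   # stop when the sweep stalls
            break
    return x
```

For a two-sector example with consumption matrix C and final demand d, the production levels solve (I - C)x = d, so `gauss_seidel(np.eye(2) - C, d)` returns the output each sector must produce.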
Abstract:
Our understanding of the pathogenesis of organ-specific autoinflammation has been restricted by limited access to the target organs. Peripheral blood, however, as a preferred transportation route for immune cells, provides a window for assessing the entire immune system throughout the body. Transcriptional profiling with RNA-stabilizing blood collection tubes reflects in vivo expression profiles at the time the blood is drawn, allowing detection of disease activity across samples or within the same sample over time. The main objective of this Ph.D. study was to apply gene-expression microarrays to the characterization of peripheral blood transcriptional profiles in patients with autoimmune diseases. To achieve this goal, a custom cDNA microarray targeted at gene-expression profiling of the human immune system was designed and produced. Sample collection and preparation were then optimized to allow gene-expression profiling from whole-blood samples. To overcome challenges resulting from minute amounts of sample material, RNA amplification was successfully applied to study pregnancy-related immunosuppression in patients with multiple sclerosis (MS). Similar sample preparation was then applied to characterize longitudinal genome-wide expression profiles in children with type 1 diabetes (T1D) associated autoantibodies and, eventually, clinical T1D. Blood transcriptome analyses, using both the ImmunoChip cDNA microarray with targeted probe selection and the genome-wide Affymetrix U133 Plus 2.0 oligonucleotide array, enabled monitoring of autoimmune activity. Novel disease-related genes and general autoimmune signatures were identified. Notably, down-regulation of the HLA class Ib molecules in peripheral blood was associated with disease activity in both MS and T1D. Taken together, these studies demonstrate the potential of peripheral blood transcriptional profiling in biomedical research and diagnostics.
Imbalances in peripheral blood transcriptional activity may reveal dynamic changes that are relevant for the disease but might be completely missed in conventional cross‐sectional studies.
Abstract:
This doctoral thesis investigates the solving ability of a number of solvers for optimization problems and reveals several difficulties in making a fair solver comparison. In addition, some improvements made to one of the solvers, GAMS/AlphaECP, are presented. Optimization, in this context, means finding the best possible solution to a problem. The class of problems investigated can be characterized as hard to solve and occurs in several industrial areas. The goal has been to investigate whether there is a solver that is universally faster and finds solutions of higher quality than any of the other solvers. The commercial optimization system GAMS (General Algebraic Modeling System) and extensive problem libraries have been used to compare the solvers. The improvements presented were made to the GAMS/AlphaECP solver, which is based on the Extended Cutting Plane (ECP) method. The ECP method has been developed mainly by Professor Tapio Westerlund at Process Design and Systems Engineering at Åbo Akademi University.
Abstract:
This book is dedicated to celebrating the 60th birthday of Professor Rainer Huopalahti. Professor Rainer "Repe" Huopalahti has had, and in fact is still enjoying, a distinguished career in the analysis of food and food-related flavor compounds. One will find it hard to make any progress in this particular field without a valid and innovative sample handling technique, and this is a field in which Professor Huopalahti has made great contributions. The title and the front cover of this book honor Professor Huopalahti's early steps in science. His PhD thesis, published in 1985, is entitled "Composition and content of aroma compounds in the dill herb, Anethum graveolens L., affected by different factors". At that time, the thesis introduced new technology applied to the sample handling and analysis of the flavoring compounds of dill. Sample handling is an essential task in just about every analysis. If one is working with minor compounds in a sample or trying to detect trace levels of analytes, one of the aims of sample handling may be to increase the sensitivity of the analytical method. On the other hand, if one is working with a challenging matrix such as the kind found in biological samples, one of the aims is to increase the selectivity. Quite often, however, the aim is to increase both the selectivity and the sensitivity. This book provides good and representative examples of the necessity of valid sample handling and the role of sample handling in the analytical method. The contributors to the book are leading Finnish scientists in the field of organic instrumental analytical chemistry. Some of them are also Repe's personal friends and former students from the University of Turku, Department of Biochemistry and Food Chemistry. Importantly, the authors all know Repe in one way or another and are well aware of his achievements in the field of analytical chemistry.
The editorial team had a great time during the planning phase and during the "hard work" editorial phase of the book. For example, we came up with many ideas on how to publish the book. After many long discussions, we decided to have a limited edition as an "old school" hardcover book – and to acknowledge more modern ways of disseminating knowledge by publishing an internet version of the book on the webpages of the University of Turku. Downloading the book from the webpage for personal use is free of charge. We believe and hope that the book will be read with great interest by scientists working in the fascinating field of organic instrumental analytical chemistry. We decided to publish our book in English for two main reasons. First, we believe that in the near future more and more teaching in Finnish universities will be delivered in English; to facilitate this process and encourage students to develop good language skills, publishing the book in English seemed appropriate. Secondly, we believe that the book will also interest scientists outside Finland – particularly in the other member states of the European Union. The editorial team thanks all the authors for their willingness to contribute to this book – and to adhere to the very strict schedule. We also want to thank the various individuals and enterprises who financially supported the book project. Without that support, it would not have been possible to publish the hardcover book.
Abstract:
This work presents software, developed in the Delphi programming language, to compute a reservoir's annual regulated active storage based on the sequent-peak algorithm. Mathematical models used for that purpose generally require extended hydrological series, and the analysis of those series is usually performed with spreadsheets or graphical representations. On that basis, software was developed for calculating reservoir active capacity. An example calculation is shown using 30 years (1977 to 2009) of monthly mean flow historical data from the Corrente River, located in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed to support water resources management, helping to manipulate data and to highlight information of interest to the user. With this interface, irrigation districts where water consumption is higher can also be analyzed as a function of specific seasonal water demand situations. A practical application shows that the program performs the calculation originally proposed. It was designed to keep information organized and retrievable at any time, and to simulate seasonal water demands throughout the year, contributing to studies concerning reservoir projects. With this functionality, the program is an important tool for decision making in water resources management.
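The sequent-peak algorithm at the heart of the program accumulates demand-minus-inflow deficits and takes the largest accumulated deficit as the required active storage. A minimal sketch with a constant per-period demand (the program itself handles seasonal demands and long monthly series):

```python
def sequent_peak(inflows, demand):
    """Required active storage by the sequent-peak algorithm:
    accumulate deficits (demand minus inflow), resetting to zero
    whenever inflow surpluses would fully refill the reservoir;
    the largest accumulated deficit is the storage needed."""
    k = 0.0
    k_max = 0.0
    for q in inflows:
        k = max(0.0, k + demand - q)   # running cumulative deficit
        k_max = max(k_max, k)          # track the sequent peak
    return k_max
```

In practice the record is often processed over two consecutive cycles so that a critical period spanning the end of the record is also captured.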