919 results for ON-LINE ANALYTICAL PROCESSING (OLAP)


Relevance:

100.00%

Publisher:

Abstract:

This paper deals with transient stability analysis based on time-domain simulation on vector processors. This approach requires the solution of a set of differential equations in conjunction with another set of algebraic equations. The solution of the algebraic equations has traditionally been treated as a scalar, sequential set of tasks, and solving these equations on vector computers has required considerable further investigation to speed up the simulations. The main objective of this paper is therefore to present methods for solving the algebraic equations using vector processing. The results, obtained on a CRAY computer, show that on-line transient stability assessment is feasible.
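The alternating treatment of the differential and algebraic sets described above can be sketched generically. The functions `f` and `solve_algebraic` below are toy stand-ins, not the paper's power-system equations; they only illustrate the partitioned scheme:

```python
# Toy partitioned scheme for a differential-algebraic system:
#   dx/dt = f(x, y)   (differential set, e.g. machine dynamics)
#   0     = g(x, y)   (algebraic set, e.g. network equations)
# Each pass advances the differential state one explicit-Euler step,
# then re-solves the algebraic constraints.

def f(x, y):
    return -y          # illustrative dynamics

def solve_algebraic(x):
    return 0.5 * x     # illustrative network: g(x, y) = y - x/2 = 0

def simulate(x0, h, steps):
    x = x0
    y = solve_algebraic(x)
    for _ in range(steps):
        x = x + h * f(x, y)       # differential step
        y = solve_algebraic(x)    # algebraic solve (the stage to vectorize)
    return x, y
```

In this partitioned view it is the repeated algebraic solve, normally a sequential bottleneck, that the paper targets with vector processing.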

The enzyme purine nucleoside phosphorylase (PNP) is a target for the discovery of new lead compounds for the treatment of severe T-cell-mediated disorders. Within this context, the development of new, direct, and reliable methods for ligand screening is an important task. This paper describes the preparation of a human PNP (HsPNP) immobilized enzyme reactor (IMER) in fused-silica capillaries. The activity of the obtained IMER is monitored on-line in a multidimensional liquid chromatography system by quantifying the product formed throughout the enzymatic reaction. The Km value for the immobilized enzyme was about twofold higher than that measured for the enzyme in solution (255 +/- 29.2 μM and 133 +/- 114.9 μM, respectively). A new fourth-generation immucillin derivative (DI4G; IC50 = 40.6 +/- 0.36 nM), previously identified and characterized in free-enzyme HsPNP assays, was used to validate the IMER as a screening method for HsPNP ligands. The validated method was also used for mechanistic studies with this inhibitor. This new approach is a valuable tool for PNP ligand screening, since it directly measures the hypoxanthine released by inosine phosphorolysis, thus furnishing more reliable results than those of the coupled enzymatic spectrophotometric assay. (C) 2011 Elsevier B.V. All rights reserved.

Proton nuclear magnetic resonance (1H NMR) spectroscopy is a successful technique for detecting biochemical changes in biological samples. However, the achievable NMR resolution is not sufficiently high when the analysis is performed on intact cells. To improve spectral resolution, high-resolution magic angle spinning (HR-MAS) is used and the broad signals are separated by a T2 filter based on the CPMG pulse sequence. Additionally, HR-MAS experiments with a T2 filter are preceded by a water suppression procedure. The goal of this work is to demonstrate that the experimental procedures of water suppression and T2 or diffusion filters are unnecessary steps when the filter diagonalization method (FDM) is used to process the time-domain HR-MAS signals. Manipulation of the FDM results, represented as a tabular list of peak positions, widths, amplitudes, and phases, allows the removal of water signals without disturbing overlapping or nearby signals. Additionally, the FDM can also be used for phase correction and noise suppression, and to discriminate between sharp and broad lines. Results demonstrate the applicability of FDM post-acquisition processing for obtaining high-quality HR-MAS spectra of heterogeneous biological materials.
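The tabular manipulation described above can be sketched minimally: a fitted line list is pruned of its water entry before the spectrum is reconstructed. The peak values and the 0.2 ppm exclusion window below are illustrative assumptions, not values from the paper:

```python
from math import pi

# A fitted line list (position in ppm, width, amplitude), as an FDM-style
# analysis would tabulate it; entries here are illustrative.
peaks = [
    (4.70, 0.05, 100.0),   # water
    (1.30, 0.01, 5.0),     # lipid CH2
    (3.20, 0.01, 2.0),     # choline
]

def lorentzian(x, pos, width, amp):
    """Absorption-mode Lorentzian line."""
    return amp * width / (pi * ((x - pos) ** 2 + width ** 2))

def spectrum(x, table):
    """Reconstruct the spectrum at position x from the peak table."""
    return sum(lorentzian(x, p, w, a) for p, w, a in table)

# Post-acquisition "water suppression": drop entries near 4.7 ppm from the
# tabular list before reconstruction, leaving nearby peaks untouched.
suppressed = [pk for pk in peaks if abs(pk[0] - 4.70) > 0.2]
```

Because the removal happens on the fitted parameters rather than on the raw signal, signals overlapping the water resonance outside the exclusion window are left intact.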

This thesis is based on five papers addressing variance reduction in different ways. The papers have in common that they all present new numerical methods. Paper I investigates quantitative structure-retention relationships from an image processing perspective, using an artificial neural network to preprocess three-dimensional structural descriptions of the studied steroid molecules. Paper II presents a new method for computing free energies. Free energy is the quantity that determines chemical equilibria and partition coefficients. The proposed method may be used for estimating, e.g., chromatographic retention without performing experiments. Two papers (III and IV) deal with correcting deviations from bilinearity by so-called peak alignment. Bilinearity is a theoretical assumption about the distribution of instrumental data that is often violated by measured data. Deviations from bilinearity lead to increased variance, both in the data and in inferences from the data, unless invariance to the deviations is built into the model, e.g., by the use of the method proposed in paper III and extended in paper IV. Paper V addresses a generic problem in classification, namely how to measure the goodness of different data representations, so that the best classifier may be constructed. Variance reduction is one of the pillars on which analytical chemistry rests. This thesis considers two aspects of variance reduction: before and after experiments are performed. Before experimenting, theoretical predictions of experimental outcomes may be used to decide which experiments to perform, and how to perform them (papers I and II). After experiments are performed, the variance of inferences from the measured data is affected by the method of data analysis (papers III-V).

Nano(bio)science and nano(bio)technology attract tremendous and growing interest in both academic and industrial settings. They are undergoing rapid development on many fronts, such as genomics, proteomics, systems biology, and medical applications. However, the lack of characterization tools for nano(bio)systems is currently considered a major limiting factor for the final establishment of nano(bio)technologies. Flow Field-Flow Fractionation (FlFFF) is a separation technique that is decidedly emerging in the bioanalytical field, and the number of applications to nano(bio)analytes such as high-molar-mass proteins and protein complexes, sub-cellular units, viruses, and functionalized nanoparticles is constantly increasing. This can be ascribed to the intrinsic advantages of FlFFF for the separation of nano(bio)analytes. FlFFF is ideally suited to separating particles over a broad size range (1 nm-1 μm) according to their hydrodynamic radius (rh). The fractionation is carried out in an empty channel by a flow stream of a mobile phase of any composition. For these reasons, fractionation proceeds without surface interaction of the analyte with packing or gel media, and there is no stationary phase able to induce mechanical or shear stress on nanosized analytes, which are therefore kept in their native state. Characterization of nano(bio)analytes is made possible after fractionation by interfacing the FlFFF system with detection techniques for morphological, optical, or mass characterization. For instance, coupling FlFFF with multi-angle light scattering (MALS) detection allows absolute molecular weight and size determination, and mass spectrometry has brought FlFFF into the field of proteomics. The potential of FlFFF coupled with multi-detection systems is discussed in the first section of this dissertation.
The second and third sections are dedicated to new methods developed for the analysis and characterization of different samples of interest in the fields of diagnostics, pharmaceutics, and nanomedicine. The second section focuses on biological samples such as protein complexes and protein aggregates. In particular, it focuses on FlFFF methods developed to give new insights into: a) the chemical composition and morphological features of blood serum lipoprotein classes, b) the time-dependent aggregation pattern of the amyloid protein Aβ1-42, and c) the aggregation state of antibody therapeutics in their formulation buffers. The third section is dedicated to the analysis and characterization of structured nanoparticles designed for nanomedicine applications. The results discussed indicate that FlFFF with on-line MALS and fluorescence detection (FD) may become an unparalleled methodology for the analysis and characterization of new, structured, fluorescent nanomaterials.

For the production of the orthophoto map Vernagtferner 1979, scale 1:10 000, photographs from the flight "Hintereisferner 1979" were used, which proved to be very suitable for differential rectification. Control points were determined before the flight took place. The processing of nine stereopairs was carried out on an analytical plotter. Simultaneously with the on-line plotting of the contour lines, the reference data for the computation of the profiles for the differential rectification were recorded. The orthophoto map was covered by four aerial photographs. A smooth data transfer was ensured because the same computer was used for data acquisition and differential rectification. Two printing originals were prepared, one for the outline drawings with contour lines and another for the orthophoto. Both copies were printed in black. The data acquisition, the computation of the scanning profiles for the orthoprojector, and the procedure of differential rectification are described. The reason for the use of on-line drawn contour lines is explained. Further applications, also for digital contour lines, are introduced. Possibilities for achieving high photo quality during reproduction are discussed.

During sentence processing there is a preference to treat the first noun phrase encountered as the subject and agent, unless it is marked otherwise. This preference leads to a conflict in thematic role assignment when the syntactic structure conforms to a non-canonical object-before-subject pattern. Left perisylvian and fronto-parietal brain networks have been found to be engaged by increased computational demands during sentence comprehension, while event-related brain potentials have been used to study the on-line manifestation of these demands. However, evidence regarding the spatiotemporal organization of brain networks in this domain is scarce. In the current study we used magnetoencephalography to track brain activity spatio-temporally while Spanish speakers were reading subject- and object-first cleft sentences. Both kinds of sentences remained ambiguous between a subject-first and an object-first interpretation up to the appearance of the second argument. Results show the time-modulation of a frontal network at the disambiguation point of object-first sentences. Moreover, the time windows in which these effects took place have previously been related to thematic role integration (300–500 ms) and to sentence reanalysis and the resolution of conflicts during processing (beyond 500 ms post-stimulus). These results point to frontal cognitive control as a putative key mechanism that may operate when a revision of the sentence structure and meaning is necessary.

The purpose of this research was to investigate the effects of Processing Instruction (VanPatten, 1996, 2007), as an input-based model for teaching second language grammar, on Syrian learners' processing abilities. The research investigated the effects of Processing Instruction on the acquisition of English relative clauses by Syrian learners in a quasi-experimental design. Three separate groups were involved (Processing Instruction, Traditional Instruction, and a Control Group). For assessment, a pre-test, an immediate post-test, and a delayed post-test were used as the main tools for eliciting data. A questionnaire was also distributed to participants in the Processing Instruction group to give them the opportunity to provide feedback on the treatment they received in comparison with the Traditional Instruction they were used to. Four hypotheses were formulated on the possible effectiveness of Processing Instruction on Syrian learners' linguistic systems. It was hypothesised that Processing Instruction would improve learners' processing abilities, leading to an improvement in their linguistic systems. This was expected to lead to better performance in the comprehension and production of English relative clauses. The main source of data was analysed statistically using ANOVA. Cohen's d calculations were also used to support the ANOVA results, showing the magnitude of the effects of the three treatments. Results of the analysis showed that both the Processing Instruction and Traditional Instruction groups improved after treatment. However, the Processing Instruction group significantly outperformed the other two groups in the comprehension of relative clauses. The analysis concluded that Processing Instruction is a useful tool for teaching relative clauses to Syrian learners.
This finding was reinforced by participants' responses to the questionnaire, as they favoured Processing Instruction over Traditional Instruction. The research has theoretical and pedagogical implications. Theoretically, the study supports the Input Hypothesis: Processing Instruction had a positive effect on input processing, as it affected learners' linguistic systems. This was reflected in learners' performance, where learners were able to produce a structure they had not been asked to produce. Pedagogically, the research showed that Processing Instruction is a useful tool for teaching English grammar in the context where the experiment was carried out, as it had a large effect on learners' performance.
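The Cohen's d values used to gauge effect magnitudes above follow the standard pooled-standard-deviation formula; a minimal sketch, using made-up scores rather than the study's data:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    sa, sb = stdev(group_a), stdev(group_b)          # sample (n-1) SDs
    pooled = (((na - 1) * sa**2 + (nb - 1) * sb**2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled
```

By common convention, |d| around 0.2 is a small effect, 0.5 medium, and 0.8 or above large.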

Bayesian algorithms set a limit on the performance that learning algorithms can achieve. Natural selection should guide the evolution of information processing systems towards those limits. What can we learn from this evolution, and what properties do the intermediate stages have? While this question is too general to permit any answer, progress can be made by restricting the class of information processing systems under study. We present analytical and numerical results for the evolution of on-line algorithms for learning from examples for neural network classifiers, which may or may not include a hidden layer. The analytical results are obtained by solving a variational problem to determine the learning algorithm that leads to maximum generalization ability. Simulations using evolutionary programming, for programs that implement learning algorithms, confirm and expand the results. The principal result is not just that the evolution is towards a Bayesian limit; indeed, that limit is essentially reached. In addition, we find that evolution is driven by the discovery of useful structures, or combinations of variables and operators, and in different runs the temporal order of the discovery of such combinations is unique. The main result is that combinations that signal the surprise brought by an example always arise before combinations that serve to gauge the performance of the learning algorithm. These latter structures can be used to implement annealing schedules. The temporal ordering can also be understood analytically by performing the functional optimization in restricted functional spaces. We also show that there are data suggesting that the appearance of these traits follows the same temporal ordering in biological systems. © 2006 American Institute of Physics.
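The notion of an on-line algorithm that updates on the "surprise" of each example can be illustrated with the simplest such rule for a neural network classifier. The perceptron update and the toy data below are illustrative stand-ins, not the variational solutions studied in the paper:

```python
import random

def online_perceptron(examples, eta=0.1, epochs=20, seed=0):
    """On-line (example-by-example) training of a single-layer classifier.
    The learning rate eta, epoch count, and data are illustrative choices."""
    rng = random.Random(seed)
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        rng.shuffle(examples)
        for x, label in examples:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1
            if pred != label:
                # The "surprise" signal: update only on misclassified examples.
                w = [wi + eta * label * xi for wi, xi in zip(w, x)]
                b += eta * label
    return w, b
```

The misclassification test plays the role of the surprise combination: the weights change only when the current example contradicts the classifier's prediction.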

* Supported partially by the Bulgarian National Science Fund under Grant MM-1405/2004

Tihomir Trifonov, Tsvetanka Georgieva-Trifonova - This paper presents the bgBell/OLAP system for warehousing and on-line analytical processing of data on unique Bulgarian bells. The implemented system makes it possible to produce summary reports and to analyse various characteristics of the bells in order to extract previously unknown and potentially useful information.
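The kind of summary report an OLAP system produces rests on roll-up aggregation over chosen dimensions; a minimal sketch, where the bell records and field names are hypothetical, not bgBell's actual schema:

```python
from collections import defaultdict

# Hypothetical bell records: dimensions (century, region), measure (weight_kg).
bells = [
    {"century": 18, "region": "Veliko Tarnovo", "weight_kg": 120},
    {"century": 18, "region": "Plovdiv",        "weight_kg": 95},
    {"century": 19, "region": "Veliko Tarnovo", "weight_kg": 210},
    {"century": 19, "region": "Plovdiv",        "weight_kg": 180},
]

def rollup(records, dims, measure):
    """Aggregate a measure over the given dimensions (an OLAP roll-up)."""
    totals = defaultdict(int)
    for rec in records:
        key = tuple(rec[d] for d in dims)
        totals[key] += rec[measure]
    return dict(totals)
```

Calling `rollup(bells, ["century"], "weight_kg")` summarizes by century alone; passing both dimensions yields the finer-grained cells of the cube.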

Tall buildings are wind-sensitive structures and can experience high wind-induced effects. Aerodynamic boundary layer wind tunnel testing has been the most commonly used method for estimating wind effects on tall buildings. Design wind effects on tall buildings are estimated through analytical processing of the data obtained from aerodynamic wind tunnel tests. Even though it is widely agreed that the data obtained from wind tunnel testing are fairly reliable, the post-test analytical procedures are still argued to carry considerable uncertainties. This research assessed in detail the uncertainties arising at different stages of the post-test analytical procedures and suggests improved techniques for reducing them. Results of the study showed that traditionally used simplifying approximations, particularly in the frequency-domain approach, can cause significant uncertainties in estimating aerodynamic wind-induced responses. Based on the identified shortcomings, a more accurate dual aerodynamic data analysis framework, which works in both the frequency and time domains, was developed. This comprehensive analysis framework allows the modal, resultant, and peak values of various wind-induced responses of a tall building to be estimated more accurately. Estimating design wind effects on tall buildings also requires synthesizing the wind tunnel data with local climatological data for the study site. After investigating the causes of significant uncertainties in currently used synthesizing techniques, a novel copula-based approach was developed for accurately synthesizing aerodynamic and climatological data. The improvement of the new approach over existing techniques was also illustrated with a case study on a 50-story building. Finally, a practical dynamic optimization approach was suggested for tuning the structural properties of tall buildings towards optimum performance against wind loads with fewer design iterations.
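One consistency requirement behind any dual frequency/time-domain framework is that the two descriptions of a response record must agree in energy (Parseval's identity); a minimal sketch with an arbitrary, made-up record:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for a short illustration)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

# Parseval: sum |x[t]|^2 == (1/N) * sum |X[k]|^2, so the variance (and hence
# RMS response) computed in the time domain must match the integral of the
# response spectrum computed in the frequency domain.
signal = [0.0, 1.0, 0.5, -0.7, 0.2, -1.1, 0.9, 0.3]
X = dft(signal)
time_energy = sum(v * v for v in signal)
freq_energy = sum(abs(c) ** 2 for c in X) / len(signal)
```

Discrepancies between the two sides of this identity in a practical pipeline point to exactly the kind of simplifying-approximation error the frequency-domain approach is prone to.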

Advertising investment and audience figures indicate that television continues to lead as a mass advertising medium. However, its effectiveness is questioned due to problems such as zapping, saturation, and audience fragmentation, which has favoured the development of non-conventional advertising formats. This study provides empirical evidence for that theoretical development. It analyzes the recall generated by four non-conventional advertising formats in a real environment: short programme (branded content), television sponsorship, and internal and external telepromotion, versus the more conventional spot. The methodology integrated secondary data with primary data from computer-assisted telephone interviews (CATI) performed ad hoc on a sample of 2,000 individuals, aged 16 to 65, representative of the total television audience. Our findings show that non-conventional advertising formats are more effective at the cognitive level, as all the analyzed formats generate higher levels of both unaided and aided recall than the spot.

In the past decade, several major food safety crises originated from problems with feed. Consequently, there is an urgent need for early detection of fraudulent adulteration and contamination in the feed chain. Strategies are presented for two specific cases, viz. adulterations of (i) soybean meal with melamine and other types of adulterants/contaminants and (ii) vegetable oils with mineral oil, transformer oil or other oils. These strategies comprise screening at the feed mill or port of entry with non-destructive spectroscopic methods (NIRS and Raman), followed by post-screening and confirmation in the laboratory with MS-based methods. The spectroscopic techniques are suitable for on-site and on-line applications. Currently they are suited to detect fraudulent adulteration at relatively high levels but not to detect low level contamination. The potential use of the strategies for non-targeted analysis is demonstrated.

DANTAS, Rodrigo Assis Neves; NÓBREGA, Walkíria Gomes da; MORAIS FILHO, Luiz Alves; MACÊDO, Eurides Araújo Bezerra de; FONSECA, Patrícia de Cássia Bezerra; ENDERS, Bertha Cruz; MENEZES, Rejane Maria Paiva de; TORRES, Gilson de Vasconcelos. Paradigms in health care and its relationship to the nursing theories: an analytical test. Revista de Enfermagem UFPE On Line, v. 4, n. 2, p. 16-24, Apr./Jun. 2010. Available at: <http://www.ufpe.br/revistaenfermagem/index.php/revista>.