1000 results for Quantitative electroencephalography


Relevance:

20.00%

Publisher:

Abstract:

This paper presents the ideas underlying a program that takes as input a schematic of a mechanical or hydraulic power transmission system, together with specifications and a utility function, and returns catalog numbers from predefined catalogs for the optimal selection of components implementing the design. It thus provides the designer with a high-level "language" in which to compose new designs, then performs some of the detailed design process automatically. The program is based on a formalization of quantitative inferences about hierarchically organized sets of artifacts and operating conditions, which allows design compilation without exhaustive enumeration of alternatives.
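The component-selection step described above can be illustrated with a toy sketch in which small hypothetical catalogs (all component data and the cost-based utility below are invented for illustration) are searched for the feasible motor/pump pair of highest utility. Unlike the paper's hierarchical inference, this sketch simply enumerates the alternatives:

```python
from itertools import product

# Hypothetical mini-catalogs: each entry carries a catalog number and ratings.
motors = [
    {"cat_no": "M-100", "power_kw": 1.0, "cost": 120},
    {"cat_no": "M-200", "power_kw": 2.0, "cost": 200},
]
pumps = [
    {"cat_no": "P-10", "max_kw": 1.5, "cost": 80},
    {"cat_no": "P-20", "max_kw": 2.5, "cost": 150},
]

def feasible(motor, pump, required_kw):
    # Specification: the motor must supply the load and the pump must accept
    # the motor's full output.
    return motor["power_kw"] >= required_kw and pump["max_kw"] >= motor["power_kw"]

def utility(motor, pump):
    # Utility function: here simply negative total cost (cheaper is better).
    return -(motor["cost"] + pump["cost"])

def compile_design(required_kw):
    # Return catalog numbers of the optimal feasible motor/pump pair.
    candidates = [(m, p) for m, p in product(motors, pumps)
                  if feasible(m, p, required_kw)]
    best_m, best_p = max(candidates, key=lambda mp: utility(*mp))
    return best_m["cat_no"], best_p["cat_no"]
```

For a 1.0 kW load this returns the cheapest adequate pair rather than the largest components, which is the sense in which the program "compiles" a specification into catalog numbers.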

Abstract:

This thesis investigates what knowledge is necessary to solve mechanics problems. A program, NEWTON, is described which understands and solves problems in a mechanics mini-world of objects moving on surfaces. Facts and equations such as those given in a mechanics text need to be represented, but this is far from sufficient to solve problems. Human problem solvers rely on "common sense" and "qualitative" knowledge which the physics text tacitly assumes to be present, and a mechanics problem solver must embody such knowledge. Quantitative knowledge given by equations, and the more qualitative common-sense knowledge, are the major research points examined in this thesis. The central issue in solving problems is planning: tentatively outlining a possible path to the solution without actually solving the problem. Such a plan needs to be constructed and debugged in the process of solving the problem. Envisionment, or qualitative simulation of the event, plays a central role in this planning process.
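Envisionment of this kind can be caricatured in a few lines: a qualitative simulation that predicts event types from the sign of each slope alone, with no equations. The segment vocabulary below is invented for illustration; note that the "up" case is ambiguous, which is exactly where quantitative knowledge must take over:

```python
def envision(segments):
    """Qualitative simulation of a frictionless slider crossing track
    segments, each labeled only with the sign of its slope."""
    events = []
    for slope in segments:
        if slope == "down":
            events.append("speeds up")
        elif slope == "flat":
            events.append("coasts")
        elif slope == "up":
            # Ambiguous without quantitative knowledge: the slider may stop
            # before the crest. A planner would flag this branch for the
            # equation-based solver to resolve.
            events.append("slows; may stop or crest")
        else:
            raise ValueError("slope must be 'down', 'flat', or 'up'")
    return events
```

The event list is the skeleton of a plan: only the ambiguous steps need the expensive quantitative machinery.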

Abstract:

The extract of Adinandra nitida leaves, known as Shiyacha in China, was studied by high-performance liquid chromatography (HPLC) with ultraviolet detection and electrospray ionisation (ESI) tandem mass spectrometry (MS). Under the optimized conditions, the analysis could be completed in 45 min on a Hypersil C18 column with negative-ion detection, using the information-dependent acquisition (IDA) mode of a Q TRAP(TM) instrument. Six flavonoids were identified as epicatechin, rhoifolin, apigenin, quercitrin, camellianin A, and camellianin B, among which rhoifolin was found in Shiyacha for the first time. The fragmentation pathways of these flavonoids were elucidated. Furthermore, with epicatechin, rhoifolin, and apigenin as markers, a quality control method for Shiyacha and its related product was established for the first time. Calibration linearity was good (R² > 0.9992) over a three-to-four-order-of-magnitude concentration range, with a detection limit (S/N = 3) of 2.5 ng. (c) 2004 Elsevier B.V. All rights reserved.

Abstract:

A quantitative analysis of the individual compounds in tobacco essential oils was performed by comprehensive two-dimensional gas chromatography (GC x GC) combined with flame ionization detection (FID). A time-of-flight mass spectrometer (TOF/MS) was coupled to GC x GC for the identification of the resolved peaks. The response of the flame ionization detector to different compound classes was calibrated using multiple internal standards. In total, 172 compounds were identified with good spectral match, and 61 compounds with high probability values were reliably quantified. For comparison, the essential oil sample was also quantified by one-dimensional gas chromatography-mass spectrometry (GC/MS) using the multiple-internal-standards method. The results showed close agreement between the two methods when the peak purity and match quality in one-dimensional GC/MS were high enough. (c) 2005 Elsevier B.V. All rights reserved.
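The internal-standard quantification mentioned above rests on a simple calculation. A minimal sketch, using the standard relative-response-factor relations rather than the paper's exact procedure (the peak areas and concentrations in the example are invented):

```python
def response_factor(area_analyte, area_is, conc_analyte, conc_is):
    # Relative response factor from a calibration mixture of known make-up:
    # ratio of area ratio to concentration ratio, analyte vs. internal standard.
    return (area_analyte / area_is) / (conc_analyte / conc_is)

def quantify(area_analyte, area_is, conc_is, rf):
    # Analyte concentration in a sample spiked with the internal standard,
    # obtained by inverting the response-factor relation.
    return (area_analyte / area_is) * conc_is / rf
```

For instance, calibration areas of 2000 (analyte) and 1000 (standard) at 10 and 5 concentration units give a response factor of 1.0; a sample peak of 1500 against a 1000-area standard at 5 units then quantifies to 7.5 units. Using several internal standards, as here, lets each compound class be calibrated against a chemically similar reference.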

Abstract:

AIM: To evaluate the suitability of reference genes in gastric tissue samples and cell lines.
METHODS: The suitability of the genes ACTB, B2M, GAPDH, RPL29, and 18S rRNA was assessed in 21 matched pairs of neoplastic and adjacent non-neoplastic gastric tissues from patients with gastric adenocarcinoma, 27 normal gastric tissues from patients without cancer, and 4 cell lines using reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR). The ranking of the best single reference gene and combination of reference genes was determined by NormFinder, geNorm(TM), BestKeeper, and DataAssist(TM). In addition, GenEx software was used to determine the optimal number of reference genes. To validate the results, the mRNA expression of a target gene, DNMT1, was quantified using the different reference gene combinations suggested by the various software packages for normalization.
RESULTS: ACTB was the best single reference gene for all gastric tissues, for the cell lines, and for all gastric tissues plus cell lines. GAPDH + B2M or ACTB + B2M was the best combination of reference genes for all the gastric tissues. On the other hand, ACTB + B2M was the best combination for all the cell lines tested and also for analyses involving all the gastric tissues plus cell lines. According to the GenEx software, 2 or 3 genes were the optimal number of reference genes for all the gastric tissues. The relative quantification of DNMT1 showed similar patterns when normalized by each combination of reference genes. The level of expression of DNMT1 in neoplastic, adjacent non-neoplastic, and normal gastric tissues did not differ when these samples were normalized using GAPDH + B2M (P = 0.32), ACTB + B2M (P = 0.61), or GAPDH + B2M + ACTB (P = 0.44).
CONCLUSION: GAPDH + B2M or ACTB + B2M is the best combination of reference genes for all the gastric tissues, and ACTB + B2M is the best combination for the cell lines tested. (C) 2013 Baishideng Publishing Group Co., Limited. All rights reserved.
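The normalization step being validated here can be sketched as follows, assuming 100% PCR efficiency so that quantity is proportional to 2^(-Cq), with the normalization factor taken as the geometric mean of the reference-gene quantities in the geNorm style (the Cq values in the example are invented):

```python
import math

def normalized_expression(cq_target, cq_refs):
    # Assuming 100% PCR efficiency, quantity is proportional to 2 ** (-Cq).
    q_target = 2.0 ** (-cq_target)
    # Normalization factor: geometric mean of the reference-gene quantities.
    nf = math.prod(2.0 ** (-c) for c in cq_refs) ** (1.0 / len(cq_refs))
    return q_target / nf
```

For example, a target Cq of 25 against reference Cqs of 20 and 22 yields a normalized quantity of 2^-4 = 0.0625. A stable reference pair gives the same relative pattern across samples, which is what the DNMT1 validation in the RESULTS section checks.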

Abstract:

There are a variety of guidelines and methods available for measuring and assessing survey quality. Most of these are based on qualitative descriptions; in practice they are not easy to implement, and it is very difficult to make comparisons between surveys. Hence there is a theoretical and pragmatic demand for a mainly quantitative survey assessment tool. This research aimed to meet this need and to contribute to the evaluation and improvement of survey quality. Acknowledging the critical importance of measurement issues in survey research, the thesis starts with a comprehensive introduction to measurement theory and identifies, through three experiments, the types of measurement error associated with measurement procedures. It then describes the concepts, guidelines, and methods available for measuring and assessing survey quality. Combining these with measurement principles leads to the development of a quantitative, statistically based holistic tool for measuring and assessing survey quality. The criteria, weights, and sub-weights for the assessment tool are determined using Multi-Criteria Decision-Making (MCDM) and a survey questionnaire based on the Delphi method. Finally, the model is applied to a database of surveys constructed to develop methods of classification, assessment, and improvement of survey quality. The model developed in this thesis enables survey researchers and/or commissioners to make a holistic assessment of the value of particular surveys. It is an Excel-based audit which follows all stages of a survey from inception through design, construction, execution, analysis, and dissemination. At each stage a set of criteria is applied to assess quality; the scores attained are weighted by the importance of each criterion and summed to give an overall assessment of the stage. The total score for a survey is obtained by combining the stage scores, weighted again by the importance of each stage. The advantage of this approach is a means of survey assessment which can be used diagnostically to assess and improve survey quality.
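The two-level weighted scoring described above can be sketched directly. The criterion and stage weights in the usage example are invented; the thesis derives them via MCDM and a Delphi questionnaire:

```python
def stage_score(scores, weights):
    # Weighted average of criterion scores for one survey stage.
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def survey_score(stage_scores, stage_weights):
    # Stage scores combined, weighted again by the importance of each stage.
    return sum(s * w for s, w in zip(stage_scores, stage_weights)) / sum(stage_weights)
```

With criterion scores [4, 2] and weights [3, 1], a stage scores 3.5; two stages scoring 3.5 and 4.0 with importance weights 2 and 1 give an overall score of about 3.67. Because each stage keeps its own sub-score, the result can be read diagnostically: a low overall score points directly at the weak stage.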

Abstract:

Ellis, D. I., Broadhurst, D., Kell, D. B., Rowland, J. J. & Goodacre, R. (2002). Rapid and quantitative detection of the microbial spoilage of meat by Fourier transform infrared spectroscopy and machine learning. Applied and Environmental Microbiology, 68(6), 2822-2828. Sponsorship: BBSRC

Abstract:

Cumbers, B., Urquhart, C. & Durbin, J. (2006). Evaluation of the KA24 (Knowledge Access 24) service for health and social care staff in London and the South-East of England. Part 1: Quantitative. Health Information and Libraries Journal, 23(2), 133-139. Sponsorship: KA24 - NHS Trusts, London

Abstract:

Jenkins, T., Hayton, D. J., Bedson, T. R. & Palmer, R. E. (2001). Quantitative evaluation of electron beam writing in passivated gold nanoclusters. Applied Physics Letters, 78, 1921-1923. RAE2008

Abstract:

Background: Single nucleotide polymorphisms (SNPs) have been used extensively in genetics and epidemiology studies. Traditionally, SNPs that did not pass the Hardy-Weinberg equilibrium (HWE) test were excluded from these analyses. Many investigators have addressed possible causes of departure from HWE, including genotyping errors, population admixture, and segmental duplication. Recent large-scale surveys have revealed abundant structural variation in the human genome, including copy number variations (CNVs). This suggests that a significant number of SNPs must lie within these regions, which may cause deviation from HWE.
Results: We performed a Bayesian analysis of the potential effect of copy number variation, segmental duplication, and genotyping errors on the behavior of SNPs. Our results suggest that copy number variation is a major cause of HWE violation for SNPs with a small minor allele frequency when the sample size is large and the genotyping error rate is 0-1%.
Conclusions: Our study provides the posterior probability that a SNP falls in a CNV or a segmental duplication, given the observed allele frequency of the SNP, the sample size, and the significance level of the HWE test.
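The HWE test used to screen SNPs here is, in its simplest form, a one-degree-of-freedom Pearson chi-square test (the genotype counts in the example are invented; compare the statistic against the 5% critical value of about 3.84):

```python
def hwe_chi2(n_aa, n_ab, n_bb):
    # Pearson chi-square statistic of observed genotype counts against the
    # Hardy-Weinberg expectation n * (p^2, 2pq, q^2), with the allele
    # frequency p estimated from the observed allele counts.
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)
    q = 1.0 - p
    expected = (n * p * p, 2 * n * p * q, n * q * q)
    return sum((o - e) ** 2 / e
               for o, e in zip((n_aa, n_ab, n_bb), expected))
```

A sample of 25/50/25 AA/AB/BB genotypes fits HWE exactly (statistic 0), while a heterozygote deficit such as 40/20/40 gives a statistic of 36, far beyond 3.84. The abstract's point is that such violations can reflect a CNV overlapping the SNP rather than a genotyping error.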

Abstract:

The topic of this thesis is an acoustic scattering technique for determining the compressibility and density of individual particles. The particles, which have diameters on the order of 10 µm, are modeled as fluid spheres. Ultrasonic tone bursts of 2 µs duration and 30 MHz center frequency scatter from individual particles as they traverse the focal region of two confocally positioned transducers. One transducer acts as a receiver, while the other both transmits and receives acoustic signals. The resulting scattered bursts are detected at 90° and at 180° (backscattered). Using either the long-wavelength (Rayleigh) or the weak-scatterer (Born) approximation, it is possible to determine the compressibility and density of the particle, provided we possess a priori knowledge of the particle size and the host properties. The detected scattered signals are digitized and stored in computer memory. With this information we can compute the mean compressibility and density averaged over a population of particles (typically 1000 particles) or display histograms of scattered-amplitude statistics. An experiment was first run to assess the feasibility of using polystyrene polymer microspheres to calibrate the instrument. A second study was performed on the buffy coat harvested from whole human blood. Finally, Chinese hamster ovary cells subjected to hyperthermia treatment were studied to see if the instrument could detect heat-induced membrane blebbing.
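In the long-wavelength (Rayleigh) limit of the fluid-sphere model, the scattered amplitude varies as the compressibility contrast plus the density contrast times cos(theta), so the 90° measurement isolates the compressibility term and the 180° measurement adds the density term. A sketch of that two-angle inversion, assuming the amplitudes have already been normalized to the model's scale factor (a simplification of the thesis's actual calibration; the numbers in the test are invented):

```python
def particle_properties(a90, a180, kappa0, rho0):
    # Long-wavelength (Rayleigh) fluid-sphere model: the scattered amplitude
    # varies as gamma_kappa + gamma_rho * cos(theta), with
    #   gamma_kappa = (kappa_p - kappa0) / kappa0
    #   gamma_rho   = 3 * (rho_p - rho0) / (2 * rho_p + rho0)
    gamma_kappa = a90                # cos 90 deg = 0: compressibility term only
    gamma_rho = gamma_kappa - a180   # cos 180 deg = -1: subtracts density term
    kappa_p = kappa0 * (1.0 + gamma_kappa)
    # Invert gamma_rho = 3*(rho_p - rho0)/(2*rho_p + rho0) for rho_p.
    rho_p = rho0 * (3.0 + gamma_rho) / (3.0 - 2.0 * gamma_rho)
    return kappa_p, rho_p
```

Given the host compressibility and density, the two detected amplitudes thus suffice to characterize each particle, which is why the instrument needs both the 90° and the backscatter channels.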

Abstract:

Neoplastic tissue is typically highly vascularized, contains abnormal concentrations of extracellular proteins (e.g. collagen, proteoglycans), and has a high interstitial fluid pressure compared to most normal tissues. These changes result in an overall stiffening typical of most solid tumors. Elasticity Imaging (EI) is a technique which uses imaging systems to measure relative tissue deformation and thus noninvasively infer its mechanical stiffness. Stiffness is recovered from measured deformation by using an appropriate mathematical model and solving an inverse problem. The integration of EI with existing imaging modalities can improve their diagnostic and research capabilities. The aim of this work is to develop and evaluate techniques to image and quantify the mechanical properties of soft tissues in three dimensions (3D). To that end, this thesis presents and validates a method by which three-dimensional ultrasound images can be used to image and quantify the shear modulus distribution of tissue-mimicking phantoms. This work is presented to motivate and justify the use of this elasticity imaging technique in a clinical breast cancer screening study; the imaging methodologies discussed are intended to improve the specificity of mammography practices in general. During the development of these techniques, several issues concerning the accuracy and uniqueness of the result were elucidated. Two new algorithms for 3D EI are designed and characterized in this thesis. The first provides three-dimensional motion estimates from ultrasound images of the deforming material. Its novel features include finite element interpolation of the displacement field, inclusion of prior information, and the ability to enforce physical constraints. The roles of regularization, mesh resolution, and an incompressibility constraint in the accuracy of the measured deformation are quantified. The estimated signal-to-noise ratios of the measured displacement fields are approximately 1800, 21, and 41 for the axial, lateral, and elevational components, respectively. The second algorithm recovers the shear elastic modulus distribution of the deforming material by efficiently solving the three-dimensional inverse problem as an optimization problem. This method utilizes finite element interpolations, the adjoint method to evaluate the gradient, and a quasi-Newton BFGS method for optimization. Its novel features include the use of the adjoint method and TVD regularization with piecewise constant interpolation. A source of non-uniqueness in this inverse problem is identified theoretically, demonstrated computationally, explained physically, and overcome practically. Both algorithms were tested on ultrasound data of independently characterized tissue-mimicking phantoms. The recovered elastic modulus was in all cases within 35% of the reference elastic contrast. Finally, the preliminary application of these techniques to tomosynthesis images showed the feasibility of imaging an elastic inclusion.
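The modulus-from-deformation idea is easiest to see in one dimension, where the inverse problem becomes directly invertible. A toy sketch (the displacements, geometry, and stress below are invented numbers), useful mainly to show by contrast why the 3D problem, with unknown stress and noisy displacement estimates, needs regularization and iterative optimization instead:

```python
def recover_moduli(displacements, node_positions, stress):
    # 1-D analogue of elasticity imaging: a bar under a known uniform stress.
    # Strain in each element follows from the measured nodal displacements;
    # Hooke's law (stress = E * strain) then inverts strain to a modulus.
    moduli = []
    for i in range(len(node_positions) - 1):
        strain = (displacements[i + 1] - displacements[i]) / \
                 (node_positions[i + 1] - node_positions[i])
        moduli.append(stress / strain)
    return moduli
```

An element that deforms half as much under the same stress is recovered as twice as stiff, which is the physical intuition behind imaging a stiff inclusion in a soft background.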