985 results for Synthetic methods
Abstract:
Current physiologically based pharmacokinetic (PBPK) models are inductive. We present an additional, different approach that is based on the synthetic rather than the inductive approach to modeling and simulation. It relies on object-oriented programming. A model of the referent system in its experimental context is synthesized by assembling objects that represent components such as molecules, cells, aspects of tissue architecture, catheters, etc. The single-pass perfused rat liver has been well described in evaluating hepatic drug pharmacokinetics (PK) and is the system on which we focus. In silico experiments begin with administration of objects representing actual compounds. Data are collected in a manner analogous to that in the referent PK experiments. The synthetic modeling method allows for recognition and representation of discrete-event and discrete-time processes, as well as heterogeneity in organization, function, and spatial effects. An application is developed for sucrose and antipyrine, administered separately and together. PBPK modeling has made extensive progress in characterizing abstracted PK properties, but this has also been its limitation. Now, other important questions and possible extensions emerge. How are these PK properties and the observed behaviors generated? The inherent heuristic limitations of traditional models have hindered getting meaningful, detailed answers to such questions. Synthetic models of the type described here are specifically intended to help answer such questions. Analogous to wet-lab experimental models, they retain their applicability even when broken apart into sub-components. Having and applying this new class of models along with traditional PK modeling methods is expected to increase the productivity of pharmaceutical research at all levels that make use of modeling and simulation.
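The assemble-objects idea behind this synthetic approach can be illustrated with a minimal, hedged sketch in Python; the class names (Compound, Sinusoid, PerfusedLiver), the per-segment extraction probability and the random routing are hypothetical simplifications for illustration, not the authors' actual implementation.

    import random

    class Compound:
        """A dose object representing one 'parcel' of an administered compound."""
        def __init__(self, name):
            self.name = name

    class Sinusoid:
        """A flow path through the lobule; each segment may extract the parcel."""
        def __init__(self, n_segments, p_extract):
            self.n_segments = n_segments
            self.p_extract = p_extract      # per-segment extraction probability

        def transit(self, parcel):
            """Return True if the parcel survives passage and reaches the outflow."""
            return all(random.random() > self.p_extract for _ in range(self.n_segments))

    class PerfusedLiver:
        """The referent single-pass system assembled from sinusoid objects."""
        def __init__(self, n_sinusoids=100, n_segments=10, p_extract=0.05):
            self.sinusoids = [Sinusoid(n_segments, p_extract) for _ in range(n_sinusoids)]

        def perfuse(self, parcels):
            """Administer parcels, route each through a random sinusoid, collect outflow."""
            outflow = []
            for parcel in parcels:
                if random.choice(self.sinusoids).transit(parcel):
                    outflow.append(parcel)
            return outflow

    # In silico experiment: administer 1,000 sucrose parcels and measure recovery.
    liver = PerfusedLiver()
    dose = [Compound("sucrose") for _ in range(1000)]
    recovered = liver.perfuse(dose)
    print(f"Fraction recovered in outflow: {len(recovered) / len(dose):.2f}")

Running the same toy experiment with a second compound object (for example antipyrine, given a different extraction probability) would loosely mirror the separate and combined administrations described above.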
Abstract:
Background/Aims: Positron emission tomography has been applied to study cortical activation during human swallowing, but it employs radio-isotopes, precluding repeated experiments, and has to be performed supine, making the task of swallowing difficult. Here we describe Synthetic Aperture Magnetometry (SAM) as a novel method of localising and imaging the brain's neuronal activity from magnetoencephalographic (MEG) signals in order to study the cortical processing of human volitional swallowing in the more physiological seated position. Methods: In 3 healthy male volunteers (age 28–36), 151-channel whole-cortex MEG (Omega-151, CTF Systems Inc.) was recorded while subjects were seated, during conditions of repeated volitional wet swallowing (5 ml boluses at 0.2 Hz) or rest. SAM analysis was then performed using spatial filters across varying frequency bands (5–60 Hz) before co-registration with individual MRI brain images. Activation areas were then identified using standard stereotactic-space neuro-anatomical maps. In one subject, repeat studies were performed to confirm the initial findings. Results: In all subjects, cortical activation maps for swallowing could be generated using SAM, the strongest activations being seen with 10–20 Hz filter settings. The main cortical activations associated with swallowing were in the sensorimotor cortex (BA 3,4), insular cortex and lateral premotor cortex (BA 6,8). Of relevance, each cortical region displayed consistent inter-hemispheric asymmetry to one or other hemisphere, this being different for each region and for each subject. Intra-subject comparisons of activation localisation and asymmetry showed impressive reproducibility. Conclusion: SAM analysis using MEG is an accurate, repeatable, and reproducible method for studying the brain processing of human swallowing in a more physiological manner and provides novel opportunities for future studies of the brain-gut axis in health and disease.
Abstract:
Modified oligonucleotides containing a sulphur group have been useful tools for studies of carcinogenesis, protein or nucleic acid structures and functions, and protein-nucleic acid interactions, and for antisense modulation of gene expression. One successful example has been the synthesis and study of oligodeoxynucleotides containing 6-thio-2'-deoxyguanosine. 6-Thio-2'-deoxyguanosine was first discovered as a metabolite of 6-mercaptopurine (6-MP); later, it was applied as a drug to treat leukaemia. During research into its toxicity, a method was developed that uses the sulphur group as a versatile position for post-synthetic modification. The advantage of post-synthetic modification lies in its convenience: synthesis of oligomers with normal sequences has become routine work in most laboratories, whereas the design and synthesis of a proper phosphoramidite monomer for a new modified nucleoside is always a difficult task, even for a skilful chemist. Thus an alternative (post-synthetic) method was developed to overcome these difficulties. This is achieved by incorporating into the oligomers versatile nucleotides that contain a leaving group which is sufficiently stable to withstand the conditions of synthesis but can be substituted by nucleophiles after synthesis, producing a series of oligomers each containing a different modified base. In the current project, a phosphoramidite monomer with 6-thioguanine was successfully synthesised and incorporated into RNA. A deprotection procedure specific for RNA was designed for oligomers containing 6-thioguanosine. The results were validated by various methods (UV, HPLC, enzymatic digestion). Pioneering work in utilising the versatile sulphur group for post-synthetic modification was also tested. Post-synthetic modification was also carried out on DNA containing 6-thio-2'-deoxyguanosine. Electrophilic reagents with various functional groups (aliphatic, aromatic, fluorescent) and bi-functional groups have been attached to the oligomers.
Abstract:
This thesis is an exploration of the organisation and functioning of the human visual system using the non-invasive functional imaging modality magnetoencephalography (MEG). Chapters one and two provide an introduction to the human visual system and to magnetoencephalographic methodologies. These chapters subsequently describe the methods by which MEG can be used to measure neuronal activity from the visual cortex. Chapter three describes the development and implementation of novel analytical tools, including beamforming-based analyses, spectrographic movies and an optimisation of group imaging methods. Chapter four focuses on the use of established and contemporary analytical tools in the investigation of visual function. This begins with an investigation of visually evoked and induced responses, covering visual evoked potentials (VEPs) and event-related synchronisation/desynchronisation (ERS/ERD). Chapter five describes the employment of novel methods in the investigation of cortical contrast response and demonstrates distinct contrast response functions in striate and extra-striate regions of visual cortex. Chapter six uses synthetic aperture magnetometry (SAM) to investigate visual cortical gamma oscillations in response to various visual stimuli, concluding that pattern is central to their generation and that their amplitude increases linearly as a function of stimulus contrast, consistent with results from invasive electrode studies in the macaque monkey. Chapter seven describes the use of driven visual stimuli and tuned SAM methods in a pilot study of retinotopic mapping using MEG, finding that activity in the primary visual cortex can be distinguished across four quadrants and two eccentricities of the visual field. Chapter eight is a novel implementation of the SAM beamforming method in the investigation of a subject with migraine visual aura; the method reveals desynchronisation of the alpha and gamma frequency bands in occipital and temporal regions contralateral to the observed visual abnormalities. The final chapter is a summary of the main conclusions and suggested further work.
Abstract:
The major challenge of MEG, the inverse problem, is to estimate the very weak primary neuronal currents from the measurements of extracranial magnetic fields. The non-uniqueness of this inverse solution is compounded by the fact that MEG signals contain large environmental and physiological noise that further complicates the problem. In this paper, we evaluate the effectiveness of magnetic noise cancellation by synthetic gradiometers and the beamformer analysis method of synthetic aperture magnetometry (SAM) for source localisation in the presence of large stimulus-generated noise. We demonstrate that activation of primary somatosensory cortex can be accurately identified using SAM despite the presence of significant stimulus-related magnetic interference. This interference was generated by a contact heat evoked potential stimulator (CHEPS), recently developed for thermal pain research, but which to date has not been used in a MEG environment. We also show that in a reduced shielding environment the use of higher order synthetic gradiometry is sufficient to obtain signal-to-noise ratios (SNRs) that allow for accurate localisation of cortical sensory function.
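The principle behind synthetic (higher-order) gradiometry, removing from each primary channel the part that can be explained by reference sensors placed away from the head, can be sketched with ordinary least squares; the channel counts, array shapes and simulated interference below are illustrative assumptions, not the CTF third-order gradiometer implementation.

    import numpy as np

    def synthetic_gradiometer(primary, reference):
        """Subtract the least-squares projection of the reference-channel
        signals from each primary MEG channel.

        primary   : (n_primary, n_samples) sensor data
        reference : (n_reference, n_samples) reference magnetometer data
        """
        # Weights that best explain each primary channel from the references.
        coeffs, *_ = np.linalg.lstsq(reference.T, primary.T, rcond=None)
        return primary - coeffs.T @ reference

    # Toy demonstration with simulated environmental interference.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 1000)
    noise = rng.standard_normal((3, t.size))                 # shared environmental noise
    reference = noise + 0.01 * rng.standard_normal((3, t.size))
    signal = np.sin(2 * np.pi * 20 * t)                      # 20 Hz cortical signal
    primary = signal + np.array([[0.8], [1.2]]) @ noise[:1]  # two primary channels

    cleaned = synthetic_gradiometer(primary, reference)
    print("residual noise power per channel:", np.var(cleaned - signal, axis=1))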
Abstract:
This work presents a two-dimensional risk assessment method based on quantification of the probability of occurrence of contaminant source terms, together with assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation, whereby synthetic contaminant source terms are generated from the same distribution as historically occurring pollution events or from an a priori probability distribution. The spatial and temporal distributions of the resulting contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times that user-defined ranges of concentration magnitudes are exceeded is quantified as the risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. The capability to generate synthetic pollution events from numerous potential sources of pollution, based on the historical frequency of their occurrence, proved to be a great asset of the method and a large benefit over contemporary methods.
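A stripped-down sketch of the exceedance-counting calculation is given below: synthetic source terms are drawn from a distribution matching historical events, each realisation is passed through a transport model, and risk is reported as the fraction of realisations in which a user-defined concentration threshold is exceeded at a monitoring point. The lognormal source distribution, event probability and linear dilution factor are illustrative stand-ins for the historical fits and integrated groundwater models used in the study.

    import numpy as np

    rng = np.random.default_rng(42)

    def simulate_concentration(source_mass, dilution=1e-3):
        """Illustrative transfer from a source term (kg) to a concentration
        (mg/l) at a monitoring borehole; a stand-in for the integrated
        contaminant transport model."""
        return source_mass * dilution

    def pollution_risk(n_realisations=10_000, threshold=0.05,
                       log_mean=1.0, log_sigma=0.8, p_event=0.3):
        """Monte Carlo estimate of the probability that the threshold
        concentration is exceeded at the monitoring point."""
        exceedances = 0
        for _ in range(n_realisations):
            if rng.random() > p_event:       # no pollution event in this realisation
                continue
            # Synthetic source term drawn to match the historical distribution.
            source = rng.lognormal(mean=log_mean, sigma=log_sigma)
            if simulate_concentration(source) > threshold:
                exceedances += 1
        return exceedances / n_realisations

    print(f"Estimated risk of exceeding 0.05 mg/l: {pollution_risk():.3f}")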
Abstract:
Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional that, when minimized, achieves PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
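One of the special cases named above, iterated running medians, is simple enough to sketch directly; the window length, iteration cap and toy step signal are arbitrary illustrative choices rather than settings from the paper.

    import numpy as np
    from scipy.signal import medfilt

    def iterated_running_median(signal, window=7, max_iter=50):
        """Repeatedly apply a running median until the signal stops changing,
        one of the classical PWC denoising special cases."""
        x = np.asarray(signal, dtype=float)
        for _ in range(max_iter):
            y = medfilt(x, kernel_size=window)
            if np.allclose(x, y):
                break
            x = y
        return x

    # Toy example: a two-level step corrupted by Gaussian noise.
    rng = np.random.default_rng(1)
    clean = np.concatenate([np.zeros(100), np.ones(100)])
    noisy = clean + 0.3 * rng.standard_normal(clean.size)
    denoised = iterated_running_median(noisy)
    print("RMS error:", np.sqrt(np.mean((denoised - clean) ** 2)))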
Abstract:
Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. In the first paper (part I) of this series of two, we presented background theory building on results from the image processing community to show that the majority of existing algorithms, and others proposed in the wider literature, are each associated with a special case of a generalized functional that, when minimized, solves the PWC denoising problem, and we showed how the minimizer can be obtained by a range of computational solver algorithms. In this second paper (part II), using the understanding developed in part I, we introduce several novel PWC denoising methods which, for example, combine the global behaviour of mean shift clustering with the local smoothing of total variation diffusion, and we show example solver algorithms for these new methods. Comparisons between these methods are performed on synthetic and real signals, revealing that our new methods have a useful role to play. Finally, overlaps between the generalized methods of these two papers and others, such as wavelet shrinkage, hidden Markov models, and piecewise smooth filtering, are touched on.
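As a rough illustration of the mean-shift ingredient mentioned above, the sketch below applies mean shift in the value domain: every sample is repeatedly moved toward the kernel-weighted mean of the sample values near it, so the signal collects onto a few constant levels. The Gaussian bandwidth, iteration count and toy signal are illustrative assumptions, and the sketch deliberately omits the total variation term of the combined methods introduced in the paper.

    import numpy as np

    def mean_shift_pwc(signal, bandwidth=0.1, n_iter=20):
        """Value-domain mean shift: each sample moves toward the kernel-weighted
        mean of the sample values within reach, so values settle onto a small
        number of constant levels (the PWC plateaus)."""
        x = np.asarray(signal, dtype=float).copy()
        for _ in range(n_iter):
            diff = x[:, None] - x[None, :]               # pairwise value differences
            weights = np.exp(-0.5 * (diff / bandwidth) ** 2)
            x = (weights @ x) / weights.sum(axis=1)
        return x

    # Toy example: a three-level staircase plus noise.
    rng = np.random.default_rng(2)
    clean = np.repeat([0.0, 1.0, 0.5], 80)
    noisy = clean + 0.05 * rng.standard_normal(clean.size)
    levels = mean_shift_pwc(noisy)
    print("RMS error vs clean:", np.sqrt(np.mean((levels - clean) ** 2)))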
Abstract:
Continuous variables are one of the major data types collected by survey organizations. They can be incomplete, so that the data collectors need to fill in the missing values, or they can contain sensitive information which needs protection from re-identification. One approach to protecting continuous microdata is to sum the values within cells defined by different features. In this thesis, I present novel methods of multiple imputation (MI) that can be applied to impute missing values and to synthesize confidential values for continuous and magnitude data.
The first method is for limiting the disclosure risk of the continuous microdata whose marginal sums are fixed. The motivation for developing such a method comes from the magnitude tables of non-negative integer values in economic surveys. I present approaches based on a mixture of Poisson distributions to describe the multivariate distribution so that the marginals of the synthetic data are guaranteed to sum to the original totals. At the same time, I present methods for assessing disclosure risks in releasing such synthetic magnitude microdata. The illustration on a survey of manufacturing establishments shows that the disclosure risks are low while the information loss is acceptable.
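The marginal-preserving synthesis in this first method can be sketched via the standard Poisson–multinomial relationship: if the cells of a row are modelled as independent Poisson counts, then conditional on the fixed row total the cells are jointly multinomial with probabilities proportional to the Poisson rates. The two-component rate mixture and the toy totals below are illustrative stand-ins for the mixture model fitted in the thesis.

    import numpy as np

    rng = np.random.default_rng(7)

    def synthesize_row(total, rates):
        """Draw synthetic non-negative integer cells that are guaranteed to sum
        to the observed row total: conditional on the total, independent Poisson
        cells with the given rates are multinomial with probabilities
        proportional to the rates."""
        p = np.asarray(rates, dtype=float)
        return rng.multinomial(total, p / p.sum())

    # Illustrative two-component mixture of Poisson rate vectors (a stand-in
    # for the fitted mixture): each establishment's row is synthesized under
    # its sampled component while preserving its published total exactly.
    components = np.array([[5.0, 1.0, 0.5],
                           [1.0, 4.0, 2.0]])
    weights = np.array([0.6, 0.4])

    observed_totals = [120, 43, 87]          # fixed marginal sums to preserve
    for total in observed_totals:
        k = rng.choice(len(weights), p=weights)          # mixture component
        row = synthesize_row(total, components[k])
        print(total, row, row.sum())                     # each row sums to its total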
The second method is for releasing synthetic continuous microdata by a nonstandard MI method. Traditionally, MI fits a model on the confidential values and then generates multiple synthetic datasets from this model. Its disclosure risk tends to be high, especially when the original data contain extreme values. I present a nonstandard MI approach conditioned on protective intervals. Its basic idea is to estimate the model parameters from these intervals rather than from the confidential values. The encouraging results of simple simulation studies suggest the potential of this new approach in limiting the posterior disclosure risk.
The third method is for imputing missing values in continuous and categorical variables. It extends a hierarchically coupled mixture model with local dependence. However, the new method separates the variables into non-focused (e.g., almost fully observed) and focused (e.g., largely missing) ones. The sub-model structure for the focused variables is more complex than that for the non-focused ones. At the same time, their cluster indicators are linked together by tensor factorization, and the focused continuous variables depend locally on non-focused values. The model properties suggest that moving strongly associated non-focused variables to the focused side can help to improve estimation accuracy, which is examined in several simulation studies. This method is applied to data from the American Community Survey.
Abstract:
A survey of primary schools in England found that girls outperform boys in English across all phases (Ofsted in Moving English forward. Ofsted, Manchester, 2012). The gender gap remains an on-going issue in England, especially for reading attainment. This paper presents evidence of gender differences in learning to read that emerged during the development of a reading scheme for 4- and 5-year-old children, in which 372 children from Reception classes in sixteen schools participated in 12-month trials. There were three arms per trial: Intervention non-PD (non-phonically decodable text with mixed-methods teaching); Intervention PD (phonically decodable text with mixed-methods teaching); and a ‘business as usual’ control condition SP (synthetic phonics and decodable text). Assignment to intervention condition was randomised. Standardised measures of word reading and comprehension were used. The research provides statistically significant evidence suggesting that boys learn more easily using a mix of whole-word and synthetic phonics approaches. In addition, the evidence indicates that boys learn to read more easily using the natural-style language of ‘real’ books, including vocabulary which goes beyond their assumed decoding ability. At post-test, boys using the non-phonically decodable text with mixed methods (Intervention non-PD) were 8 months ahead in reading comprehension compared to boys using a wholly synthetic phonics approach.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The purpose of this paper is to survey and assess the state-of-the-art in automatic target recognition for synthetic aperture radar imagery (SAR-ATR). The aim is not to develop an exhaustive survey of the voluminous literature, but rather to capture in one place the various approaches for implementing the SAR-ATR system. This paper is meant to be as self-contained as possible, and it approaches the SAR-ATR problem from a holistic end-to-end perspective. A brief overview of the breadth of the SAR-ATR challenges is given; it is couched in terms of a single-channel SAR and is extendable to multi-channel SAR systems. Stages pertinent to the basic SAR-ATR system structure are defined, and the motivations for the requirements and constraints on the system constituents are addressed. For each stage in the SAR-ATR processing chain, a taxonomization methodology for surveying the numerous methods published in the open literature is proposed. Carefully selected works from the literature are presented under the proposed taxa. Novel comparisons, discussions, and comments are provided throughout this paper. A two-fold benchmarking scheme for evaluating existing SAR-ATR systems and motivating new system designs is proposed, and the scheme is applied to the works surveyed in this paper. Finally, a discussion is presented in which various interrelated issues, such as standard operating conditions, extended operating conditions, and target-model design, are addressed. This paper is a contribution toward fulfilling an objective of end-to-end SAR-ATR system design.
Abstract:
Synthetic cannabinoid receptor agonists, more commonly known as synthetic cannabinoids (SCs), were originally created to obtain the medicinal value of THC, but they are an emerging social problem. SCs are mostly produced coated on herbal materials or in powder form and are marketed under a variety of brand names, e.g. “Spice” and “K2”. Despite many SCs becoming controlled under drug legislation, many of them remain legal in some countries around the world. In Scotland, SCs are controlled under the Misuse of Drugs Act 1971 and the Psychoactive Substances Act 2016, which only cover a few early SCs. In Saudi Arabia, even fewer are controlled. The picture of the SC problem in Scotland is vague due to insufficient prevalence data, particularly data based on biological samples. Whilst there is evidence of increasing use of SCs throughout the world, in Saudi Arabia there is currently no data regarding the use of products containing SCs among Saudi people. Several studies indicate that SCs may cause serious toxicity and impairment to health, therefore it is important to understand the scale of use within society. A simple and sensitive method was developed for the simultaneous analysis of 10 parent SCs (JWH-018, JWH-073, JWH-250, JWH-200, AM-1248, UR-144, A-796260, AB-FUBINACA, 5F-AKB-48 and 5F-PB-22) in whole blood and 8 corresponding metabolites (JWH-018 4-OH pentyl, JWH-073 3-OH butyl, JWH-250 4-OH pentyl, AM-2201 4-OH pentyl, JWH-122 5-OH pentyl, JWH-210 5-OH pentyl, 5F-AKB-48 (N-4 OH pentyl) and 5F-PB-22 3-carboxyindole) in urine using LLE and LC-MS/MS. The method was validated according to the standard practices for method validation in forensic toxicology (SWGTOX, May 2013). All analytes gave acceptable precision, linearity and recovery for analysing blood and urine samples. The method was applied to 1,496 biological samples, a mixture of whole blood and urine. Blood and/or urine samples were analysed from 114 patients presenting at Accident and Emergency in Glasgow Royal Infirmary in spring 2014 and June to December 2015. 5F-AKB-48, 5F-PB-22 and MDMB-CHMICA were detected in 9, 7 and 9 cases respectively. 904 urine samples from individuals admitted to or liberated from Scottish prisons during November 2013 were tested for the presence of SCs. 5F-AKB-48 (N-4 OH pentyl) was detected in 10 cases and 5F-PB-22 3-carboxyindole in 3 cases. Blood and urine samples from two post-mortem cases in Scotland with suspected ingestion of SCs were analysed; both cases were confirmed positive for 5F-AKB-48. A total of 463 urine samples were collected from personnel who presented to the Security Forces Hospital in Riyadh for workplace drug testing as a requirement of their job during July 2014. The analysis found 2 samples to be positive for 5F-PB-22 3-carboxyindole. A further study in Saudi Arabia using a questionnaire was carried out among 3 subpopulations: medical professionals, members of the public in and around smoking cafes, and known drug users. With regard to general awareness of Spice products, 16%, 11% and 22% of the medical professionals, members of the public in and around smoking cafes, and known drug users, respectively, were aware of the existence of SCs or Spice products. Overall, an average of 4.5% of respondents had a friend who used these Spice products. It is clear from the results obtained in both blood and urine testing and from the surveys that SCs are being used in both Scotland and Saudi Arabia.
The extent of their use is not clear, and the data presented here are an initial look into their prevalence. Blood and urine findings suggest changing trends in SC use worldwide, moving away from the JWH and AM series towards the newer 5F-AKB-48, 5F-PB-22 and MDMB-CHMICA compounds. In both countries 5F-PB-22 was detected. These findings illustrate that the SC phenomenon is a worldwide problem and that information from each country regarding which SCs are seized can help other countries rather than being specific to that country. The analytes included in the method were selected due to their apparent availability in both countries; however, it is possible that some newer analytes have been used and these would not have been detected. For this reason it is important that methods for testing SCs are updated regularly and evolve with the ever-changing availability of these drugs worldwide. In addition, there is little published literature regarding the concentrations of these drugs found in blood and urine samples, and this work goes some way towards understanding these.
Abstract:
The increasing resolution of numerical weather prediction models has allowed more and more realistic forecasts of atmospheric parameters. Owing to the growing variability of the predicted fields, traditional verification methods are not always able to describe model skill, because they are based on grid-point-by-grid-point matching between observation and prediction. Recently, new spatial verification methods have been developed with the aim of showing the benefit associated with high-resolution forecasts. Within the MesoVICT international project, the initial aim of this work is to compare the new techniques, noting their advantages and disadvantages. First, the MesoVICT basic examples, represented by synthetic precipitation fields, were examined. Because it provides an error evaluation in terms of structure, amplitude and location of the precipitation fields, the SAL method was studied more thoroughly than the other approaches and implemented for the core cases of the project. The verification procedure concerned precipitation fields over central Europe: comparisons between the forecasts produced by the 00z COSMO-2 model and the VERA (Vienna Enhanced Resolution Analysis) analysis were performed. The study of these cases revealed some weaknesses of the methodology; in particular, a correlation between the optimal domain size and the extent of the precipitation systems was highlighted. In order to improve the ability of SAL, the original domain was subdivided into three subdomains and the method was applied again. Some limits were found in cases in which at least one of the two fields shows no precipitation. The overall results for the subdomains are summarized in scatter plots. With the aim of identifying systematic errors of the model, the variability of the three parameters was studied for each subdomain.
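For reference, the amplitude component and the first part of the location component of SAL can be computed directly from the domain-averaged precipitation and the fields' centres of mass; the sketch below follows the commonly cited SAL definitions and omits the structure component S, which requires identifying coherent precipitation objects. The toy fields and grid spacing are illustrative assumptions.

    import numpy as np

    def amplitude_component(forecast, observed):
        """SAL amplitude component A: normalised difference of the
        domain-averaged precipitation, bounded in [-2, 2]."""
        d_mod, d_obs = forecast.mean(), observed.mean()
        return (d_mod - d_obs) / (0.5 * (d_mod + d_obs))

    def location_component_l1(forecast, observed, dx=1.0, dy=1.0):
        """First SAL location component L1: distance between the fields'
        centres of mass, normalised by the largest distance in the domain."""
        def centre_of_mass(field):
            ny, nx = field.shape
            y, x = np.mgrid[0:ny, 0:nx]
            return np.array([(y * field).sum(), (x * field).sum()]) / field.sum()

        ny, nx = observed.shape
        d_max = np.hypot((ny - 1) * dy, (nx - 1) * dx)       # domain diagonal
        shift = (centre_of_mass(forecast) - centre_of_mass(observed)) * np.array([dy, dx])
        return np.linalg.norm(shift) / d_max

    # Toy fields: the forecast places a displaced, weaker precipitation blob.
    obs = np.zeros((50, 50)); obs[10:20, 10:20] = 2.0
    fc = np.zeros((50, 50)); fc[25:35, 30:40] = 1.5
    print("A  =", round(amplitude_component(fc, obs), 3))
    print("L1 =", round(location_component_l1(fc, obs), 3))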
Abstract:
We propose a method, denoted the synthetic portfolio, for event studies in market microstructure; it is particularly interesting to use with high-frequency data and thinly traded markets. The method is based on the Synthetic Control Method (SCM) and provides a robust, data-driven way to build a counterfactual for evaluating the effects of volatility call auctions. We find that SCM can be used if the loss function is defined as the difference between the returns of the asset and the returns of a synthetic portfolio. We apply SCM to test the performance of the volatility call auction as a circuit breaker in the context of an event study. We find that, for Colombian Stock Market securities, the asynchronicity of intraday data restricts the analysis to a selected group of stocks; however, it is possible to build a tracking portfolio. The realized volatility increases after the auction, indicating that the mechanism is not enhancing the price discovery process.
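A minimal sketch of the counterfactual construction: non-negative weights summing to one are chosen so that a portfolio of donor stocks tracks the treated asset's returns over the pre-event window, and the post-event gap between the asset and the portfolio is then read as the effect of the auction. The scipy-based optimisation and the simulated returns are illustrative assumptions, not the paper's estimation procedure.

    import numpy as np
    from scipy.optimize import minimize

    def fit_synthetic_portfolio(treated_pre, donors_pre):
        """Find weights w >= 0 with sum(w) = 1 that minimise the pre-event
        tracking error between the treated asset's returns and the weighted
        donor returns."""
        n = donors_pre.shape[1]

        def loss(w):
            return np.sum((treated_pre - donors_pre @ w) ** 2)

        result = minimize(loss, x0=np.full(n, 1.0 / n),
                          bounds=[(0.0, 1.0)] * n,
                          constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
        return result.x

    # Toy example: 120 pre-event return observations on 5 donor stocks.
    rng = np.random.default_rng(3)
    donors_pre = 0.01 * rng.standard_normal((120, 5))
    true_w = np.array([0.5, 0.3, 0.2, 0.0, 0.0])
    treated_pre = donors_pre @ true_w + 0.001 * rng.standard_normal(120)

    w = fit_synthetic_portfolio(treated_pre, donors_pre)
    print("estimated weights:", np.round(w, 2))
    # Post-event, the gap between the asset's returns and donors_post @ w
    # estimates the effect of the volatility call auction.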