978 results for Synthetic methods


Relevance: 30.00%

Publisher:

Abstract:

Compression ignition (CI) engine design is subject to many constraints, which presents the engine researcher with a multi-criteria optimisation problem. In particular, the modern CI engine must not only be efficient, but must also deliver low gaseous, particulate and life cycle greenhouse gas emissions so that its impact on urban air quality, human health and global warming is minimised. Consequently, this study undertakes a multi-criteria analysis which seeks to identify alternative fuels, injection technologies and combustion strategies that could potentially satisfy these CI engine design constraints. Three datasets are analysed with the Preference Ranking Organization Method for Enrichment Evaluations and Geometrical Analysis for Interactive Aid (PROMETHEE-GAIA) algorithm to explore the impact of (1) an ethanol fumigation system, (2) alternative fuels (20 % biodiesel and synthetic diesel) and alternative injection technologies (mechanical direct injection and common rail injection), and (3) various biodiesel fuels made from three feedstocks (soy, tallow and canola) tested at several blend percentages (20-100 %) on the resulting emissions and efficiency profile of the various test engines. The results show that moderate ethanol substitutions (~20 % by energy) at moderate load, high-percentage soy blends (60-100 %), and alternative fuels (biodiesel and synthetic diesel) provide an efficiency and emissions profile that yields the most “preferred” solutions to this multi-criteria engine design problem. Further research is, however, required to reduce Reactive Oxygen Species (ROS) emissions with alternative fuels, and to deliver technologies that do not significantly reduce the median diameter of particle emissions.
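PROMETHEE II ranks alternatives by weighted pairwise preference comparisons across criteria. The following is a minimal, illustrative sketch using the "usual" preference function; the alternatives, weights and scores are hypothetical and are not the engine datasets from the study:

```python
import numpy as np

def promethee_ii(X, weights, maximise):
    """Minimal PROMETHEE II with the 'usual' preference function.

    X: (n_alternatives, n_criteria) performance matrix.
    Returns the net outranking flow phi for each alternative."""
    n, _ = X.shape
    # Orient every criterion so that larger is better.
    Y = np.where(maximise, X, -X)
    phi = np.zeros(n)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Usual criterion: full preference for any positive difference.
            P_ij = (Y[i] > Y[j]).astype(float)
            P_ji = (Y[j] > Y[i]).astype(float)
            phi[i] += (np.dot(weights, P_ij) - np.dot(weights, P_ji)) / (n - 1)
    return phi

# Hypothetical alternatives scored on efficiency (max) and two emissions (min).
X = np.array([[0.42, 1.0, 0.20],   # e.g. moderate ethanol substitution
              [0.40, 1.8, 0.35],   # e.g. neat diesel baseline
              [0.41, 1.2, 0.25]])  # e.g. biodiesel blend
weights = np.array([0.4, 0.3, 0.3])
maximise = np.array([True, False, False])
phi = promethee_ii(X, weights, maximise)
ranking = np.argsort(-phi)  # most "preferred" alternative first
```

Net flows sum to zero by construction; an alternative that dominates on every criterion receives the maximum net flow of 1.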

Relevance: 30.00%

Publisher:

Abstract:

Purpose: Small field x-ray beam dosimetry is difficult due to a lack of lateral electronic equilibrium, source occlusion, high dose gradients and detector volume averaging. Currently there is no single definitive detector recommended for small field dosimetry. The objective of this work was to evaluate the performance of a new commercial synthetic diamond detector, namely the PTW 60019 microDiamond, for the dosimetry of small x-ray fields as used in stereotactic radiosurgery (SRS). Methods: Small field sizes were defined by BrainLAB circular cones (4-30 mm diameter) on a Novalis Trilogy linear accelerator using the 6 MV SRS x-ray beam mode for all measurements. Percentage depth doses were measured and compared to an IBA SFD diode and a PTW 60012 E diode. Cross profiles were measured and compared to an IBA SFD diode. Field factors, Ω_{Q_clin,Q_msr}^{f_clin,f_msr}, were calculated by Monte Carlo methods using BEAMnrc, and correction factors, k_{Q_clin,Q_msr}^{f_clin,f_msr}, were derived for the PTW 60019 microDiamond detector. Results: For the small fields of 4 to 30 mm diameter, there were dose differences in the PDDs of up to 1.5% when compared to the IBA SFD and PTW 60012 E diode detectors. For the cross profile measurements the penumbra values varied depending upon the orientation of the detector. The field factors were calculated for these field diameters at a depth of 1.4 cm in water and were within 2.7% of published values for a similar linear accelerator. The correction factors were derived for the PTW 60019 microDiamond detector. Conclusions: We conclude that the new PTW 60019 microDiamond detector is generally suitable for relative dosimetry in small 6 MV SRS beams on a Novalis Trilogy linear accelerator equipped with circular cones.
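In the small-field formalism behind the quantities above, the field factor Ω relates dose in the clinical field to dose in the machine-specific reference field, and the detector correction factor k relates it to the measured reading ratio: Ω = (M_clin/M_msr)·k. A minimal arithmetic sketch; the readings and field factor below are hypothetical, not measured values from the study:

```python
def correction_factor(omega, m_clin, m_msr):
    """k_{Q_clin,Q_msr}^{f_clin,f_msr} = Omega / (M_clin / M_msr):
    the factor that converts a detector reading ratio into a dose ratio
    in the small-field formalism."""
    return omega * m_msr / m_clin

# Illustrative numbers only (not from the study):
omega = 0.950                        # Monte Carlo field factor, e.g. 4 mm cone
m_clin, m_msr = 4.62e-9, 4.80e-9     # hypothetical detector readings (C)
k = correction_factor(omega, m_clin, m_msr)
```

A k close to unity indicates the detector's reading ratio already tracks the dose ratio well, which is the practical appeal of the microDiamond for small fields.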

Relevance: 30.00%

Publisher:

Abstract:

This thesis developed a condition assessment and rating method to identify the bridges in a network that are in most need of repair for effective life cycle management. The method estimates the contribution of critical factors to bridge deterioration and uses structural analysis to overcome the subjectivity of current condition assessment methods. This research was part of the CRC project titled 'Life Cycle Management of Railway Bridges'. Efficient usage of resources and enhanced safety and serviceability of railway bridges are the significant outcomes of using the proposed method.

Relevance: 30.00%

Publisher:

Abstract:

Having the ability to work with complex models can be highly beneficial, but the computational cost of doing so is often large. Complex models often have intractable likelihoods, so methods that directly use the likelihood function are infeasible. In these situations, the benefits of working with likelihood-free methods become apparent. Likelihood-free methods, such as parametric Bayesian indirect likelihood, which uses the likelihood of an alternative parametric auxiliary model, have been explored throughout the literature as good alternatives when the model of interest is complex. One of these methods is the synthetic likelihood (SL), which assumes a multivariate normal approximation to the likelihood of a summary statistic of interest. This paper explores the accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach in comparison to a competitor known as approximate Bayesian computation (ABC), as well as its sensitivity to tuning parameters and assumptions. We relate BSL to pseudo-marginal methods and propose an alternative SL that uses an unbiased estimator of the exact working normal likelihood when the summary statistic has a multivariate normal distribution. Several applications of varying complexity are considered to illustrate the findings of this paper.
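The core SL computation can be sketched for a toy model: simulate datasets at a parameter value, summarise them, fit a multivariate normal to the summaries, and evaluate the observed summary under it. The simulator and summaries below are assumed toy choices; the unbiased variant discussed in the paper replaces this plug-in normal density:

```python
import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, s_obs, simulate, summarise, m=200, rng=None):
    """Gaussian synthetic log-likelihood (Wood 2010): simulate m datasets at
    theta, summarise them, and evaluate the observed summary under a fitted
    multivariate normal."""
    rng = np.random.default_rng(rng)
    S = np.array([summarise(simulate(theta, rng)) for _ in range(m)])
    mu = S.mean(axis=0)
    Sigma = np.cov(S, rowvar=False)
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=Sigma)

# Toy model: n iid N(theta, 1) draws, summarised by (mean, variance).
simulate = lambda theta, rng: rng.normal(theta, 1.0, size=50)
summarise = lambda y: np.array([y.mean(), y.var(ddof=1)])

rng = np.random.default_rng(0)
s_obs = summarise(simulate(0.0, rng))   # "observed" data generated at theta = 0
ll_true = synthetic_loglik(0.0, s_obs, simulate, summarise, rng=1)
ll_far  = synthetic_loglik(3.0, s_obs, simulate, summarise, rng=2)
```

Inside a Metropolis-Hastings loop this estimated log-likelihood takes the place of the exact one, which is what connects BSL to pseudo-marginal methods.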

Relevance: 30.00%

Publisher:

Abstract:

Large-scale chromosome rearrangements such as copy number variants (CNVs) and inversions encompass a considerable proportion of the genetic variation between human individuals, and in a number of cases they have been closely linked with various inheritable diseases. Single-nucleotide polymorphisms (SNPs) constitute another large part of the genetic variance between individuals; they are typically abundant, and measuring them is straightforward and cheap. This thesis presents computational means of using SNPs to detect the presence of inversions and deletions, a particular variety of CNV. Technically, the inversion-detection algorithm detects the suppressed recombination rate between inverted and non-inverted haplotype populations, whereas the deletion-detection algorithm uses the EM algorithm to estimate the haplotype frequencies of a window with and without a deletion haplotype. As a contribution to population biology, a coalescent simulator for simulating inversion polymorphisms has been developed; coalescent simulation is a backward-in-time method of modelling population ancestry. The simulator also models multiple crossovers by using the Counting model as the chiasma interference model. Finally, this thesis includes an experimental section. The aforementioned methods were tested on synthetic data to evaluate their power and specificity, and were also applied to the HapMap Phase II and Phase III data sets, yielding a number of candidates for previously unknown inversions and deletions and also correctly detecting known rearrangements.
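The deletion-detection idea can be illustrated with a classic single-SNP null-allele EM, in which a deletion allele hides behind an apparent homozygote or a failed genotype. This is a simplified stand-in for the thesis's window-based EM, and the genotype counts below are hypothetical Hardy-Weinberg expected values:

```python
def em_null_allele(n_AA, n_AB, n_BB, n_missing, iters=200):
    """EM estimate of allele frequencies (pA, pB, pd) at a biallelic SNP where
    a deletion allele d makes A/d look like A/A, B/d look like B/B, and d/d
    look like a failed genotype."""
    n = n_AA + n_AB + n_BB + n_missing
    pA, pB, pd = 0.4, 0.4, 0.2          # initial guess
    for _ in range(iters):
        # E-step: split observed homozygotes into true homozygotes vs
        # hemizygotes carrying the deletion.
        eAA = n_AA * pA**2 / (pA**2 + 2 * pA * pd)
        eAd = n_AA - eAA
        eBB = n_BB * pB**2 / (pB**2 + 2 * pB * pd)
        eBd = n_BB - eBB
        # M-step: gene counting over the 2n allele copies.
        pA = (2 * eAA + eAd + n_AB) / (2 * n)
        pB = (2 * eBB + eBd + n_AB) / (2 * n)
        pd = (eAd + eBd + 2 * n_missing) / (2 * n)
    return pA, pB, pd

# Counts consistent with Hardy-Weinberg at (pA, pB, pd) = (0.5, 0.3, 0.2):
pA, pB, pd = em_null_allele(n_AA=4500, n_AB=3000, n_BB=2100, n_missing=400)
```

With these idealised counts the EM recovers the generating frequencies; on real data the giveaway is the same homozygote excess and missingness that the E-step exploits.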

Relevance: 30.00%

Publisher:

Abstract:

Matrix decompositions, where a given matrix is represented as a product of two other matrices, are regularly used in data mining. Most matrix decompositions have their roots in linear algebra, but the needs of data mining are not always those of linear algebra. In data mining one needs to have results that are interpretable -- and what is considered interpretable in data mining can be very different to what is considered interpretable in linear algebra.

The purpose of this thesis is to study matrix decompositions that directly address the issue of interpretability. An example is a decomposition of binary matrices where the factor matrices are assumed to be binary and the matrix multiplication is Boolean. The restriction to binary factor matrices increases interpretability -- factor matrices are of the same type as the original matrix -- and allows the use of Boolean matrix multiplication, which is often more intuitive than normal matrix multiplication with binary matrices. Also several other decomposition methods are described, and the computational complexity of computing them is studied together with the hardness of approximating the related optimization problems. Based on these studies, algorithms for constructing the decompositions are proposed. Constructing the decompositions turns out to be computationally hard, and the proposed algorithms are mostly based on various heuristics. Nevertheless, the algorithms are shown to be capable of finding good results in empirical experiments conducted with both synthetic and real-world data.
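The Boolean product underlying these decompositions replaces addition with logical OR. A small sketch (illustrative matrices of my own choosing) shows a binary matrix whose Boolean rank (2) is lower than its real rank (3), which is one reason Boolean factors can be more compact and interpretable:

```python
import numpy as np

def boolean_product(U, V):
    """Boolean matrix product: (U o V)[i, j] = OR_k (U[i, k] AND V[k, j]).
    For 0/1 matrices this is the ordinary product with + replaced by OR."""
    return ((U @ V) > 0).astype(int)

# A Boolean rank-2 factorization: each row of B is the union (OR) of the
# factor-row patterns in V selected by the corresponding row of U.
U = np.array([[1, 0],
              [0, 1],
              [1, 1]])
V = np.array([[1, 1, 0, 0],
              [0, 1, 1, 1]])
B = boolean_product(U, V)
```

Here B's third row is the union of the first two patterns, so Boolean rank 2 suffices even though B has real rank 3.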

Relevance: 30.00%

Publisher:

Abstract:

Arabinomannan-containing glycolipids, relevant to the mycobacterial cell-wall component lipoarabinomannan, were synthesized by chemical methods. The glycolipids presented tri- and tetrasaccharide arabinomannans as the sugar portion and a double alkyl chain as the lipophilic portion. Following synthesis, systematic biological and biophysical studies were undertaken to identify the effects of the glycolipids on mycobacterial growth. The studies included mycobacterial growth, biofilm formation and motility assays. It was observed that the synthetic glycolipid with more arabinan residues inhibited mycobacterial growth, lessened biofilm formation and impaired the motility of mycobacteria. A surface plasmon resonance study, with an immobilized glycan surface and mycobacterial crude lysates as analytes, showed that the interactions were specific. Further, it was found that cell lysates from motile bacteria bound the oligosaccharides with higher affinity than those from non-motile bacteria.

Relevance: 30.00%

Publisher:

Abstract:

Synthetic aperture radar (SAR) is a powerful tool for mapping and remote sensing. The theory and operation of SAR have seen a period of intense activity in recent years. This paper attempts to review some of the more advanced topics studied in connection with modern SAR systems based on digital processing. Following a brief review of the principles involved in the operation of SAR, attention is focussed on special topics such as advanced SAR modelling and focussing techniques (in particular clutterlock and autofocus), Doppler centroid (DC) estimation methods involving the seismic migration technique, moving target imaging, bistatic radar imaging, and the effects of system nonlinearities.

Relevance: 30.00%

Publisher:

Abstract:

In this paper we study the problem of designing SVM classifiers when the kernel matrix, K, is affected by uncertainty. Specifically, K is modeled as a positive affine combination of given positive semidefinite kernels, with the coefficients ranging in a norm-bounded uncertainty set. We treat the problem using the Robust Optimization methodology, which reduces the uncertain SVM problem to a deterministic conic quadratic problem that can be solved in principle by a polynomial-time Interior Point (IP) algorithm. However, for large-scale classification problems, IP methods become intractable and one has to resort to first-order gradient-type methods. The strategy we use here is to reformulate the robust counterpart of the uncertain SVM problem as a saddle point problem and employ a special gradient scheme which works directly on the convex-concave saddle function. The algorithm is a simplified version of a general scheme due to Juditsky and Nemirovski (2011). It achieves an O(1/T^2) reduction of the initial error after T iterations. A comprehensive empirical study on both synthetic data and real-world protein structure data sets shows that the proposed formulations achieve the desired robustness, and the saddle point based algorithm significantly outperforms the IP method.
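The uncertainty model can be sketched directly: any nonnegative combination of positive semidefinite kernel matrices is again positive semidefinite, so every kernel in the uncertainty set remains a valid SVM kernel. A minimal check on hypothetical data (the kernels and coefficients are illustrative):

```python
import numpy as np

def combined_kernel(kernels, theta):
    """Positive affine combination sum_i theta_i * K_i of PSD kernel matrices;
    with theta >= 0 the result is again PSD."""
    return sum(t * K for t, K in zip(theta, kernels))

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
K_lin = X @ X.T                                           # linear kernel (Gram)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-0.5 * sq)                                 # RBF kernel
K = combined_kernel([K_lin, K_rbf], theta=[0.7, 0.3])
min_eig = np.linalg.eigvalsh(K).min()
```

The robust formulation then optimises the worst case over the coefficient vector theta inside its norm-bounded set rather than fixing it in advance.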

Relevance: 30.00%

Publisher:

Abstract:

Flood is one of the most detrimental hydro-meteorological threats to mankind, which calls for efficient flood assessment models. In this paper, we propose remote sensing based flood assessment using Synthetic Aperture Radar (SAR) imagery because of its imperviousness to unfavourable weather conditions. SAR images, however, suffer from speckle noise, so the processing is applied in two stages: speckle removal filtering, followed by image segmentation for flood mapping. The speckle noise is reduced with Lee, Frost and Gamma MAP filters, and a performance comparison of these speckle removal filters is presented; from the results obtained, we deduce that the Gamma MAP filter is the most reliable. The Gamma MAP filtered image is then segmented using the Gray Level Co-occurrence Matrix (GLCM) and Mean Shift Segmentation (MSS). GLCM is a texture analysis method that separates the image pixels into water and non-water groups based on their spectral features, whereas MSS is a gradient ascent method in which segmentation is carried out using both spectral and spatial information. As a test case, the Kosi river flood is considered in our study. The segmentation results of both methods are comprehensively analysed, and we conclude that MSS is the more efficient for flood mapping.
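Of the three speckle filters compared, the Lee filter is the simplest to sketch: each pixel is shrunk toward its local mean by a weight based on local versus overall variance, smoothing homogeneous areas while preserving edges. A minimal version with an illustrative global noise estimate (not the paper's exact parameters):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, size=7):
    """Classic Lee speckle filter: out = mean + w * (img - mean), where
    w = local_var / (local_var + noise_var)."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img ** 2, size)
    var = np.maximum(sq_mean - mean ** 2, 0.0)
    noise_var = var.mean()          # simple global noise-variance estimate
    w = var / (var + noise_var)
    return mean + w * (img - mean)

# Synthetic speckled scene: constant backscatter with multiplicative noise,
# as in a homogeneous water surface in a SAR image.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 100.0)
speckled = scene * rng.gamma(shape=4.0, scale=0.25, size=scene.shape)
filtered = lee_filter(speckled)
```

In homogeneous regions the local variance is mostly noise, so w is small and the output approaches the local mean; near edges the local variance is large, w approaches 1, and the pixel is left nearly unchanged.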

Relevance: 30.00%

Publisher:

Abstract:

In this paper, we propose a super resolution (SR) method for synthetic images using FeatureMatch. Existing state-of-the-art super resolution methods are learning-based: a pair of low-resolution and high-resolution dictionaries is trained, and this trained pair is used to replace patches in the low-resolution image with appropriate matching patches from the high-resolution dictionary. In this paper, we show that by using Approximate Nearest Neighbour Fields (ANNF) and a common source image, we can bypass the learning phase and use a single image as the dictionary, reducing the dictionary from a collection obtained from hundreds of training images to a single image. We show that by modifying the latest developments in ANNF computation to suit super resolution, we can perform much faster and more accurate SR than existing techniques. To establish this claim, we compare our algorithm against various state-of-the-art algorithms and show that we are able to achieve better and faster reconstruction without any training phase.
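The core idea, searching a single source image for matching patches instead of a trained dictionary pair, can be sketched with a brute-force nearest-neighbour search standing in for the ANNF computation. The toy images and block-mean downsampling below are assumptions for illustration only:

```python
import numpy as np

def block_mean(img, f):
    """Downsample by averaging f x f blocks (a stand-in for LR formation)."""
    h, w = img.shape
    return img.reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def patch_match_sr(lr, src_hr, f=2):
    """Dictionary-free SR sketch: for every LR pixel, brute-force the nearest
    pixel in the downsampled source, then paste the source's corresponding
    f x f high-resolution patch into the output."""
    src_lr = block_mean(src_hr, f)
    flat = src_lr.ravel()
    h, w = lr.shape
    out = np.zeros((h * f, w * f))
    for i in range(h):
        for j in range(w):
            k = int(np.argmin(np.abs(flat - lr[i, j])))  # nearest LR match
            si, sj = divmod(k, src_lr.shape[1])
            out[i*f:(i+1)*f, j*f:(j+1)*f] = \
                src_hr[si*f:(si+1)*f, sj*f:(sj+1)*f]
    return out

# When the input is the source's own LR image, reconstruction is exact.
src_hr = np.arange(64, dtype=float).reshape(8, 8)
restored = patch_match_sr(block_mean(src_hr, 2), src_hr)
```

The actual method replaces this quadratic brute-force search with an approximate nearest neighbour field and uses multi-pixel feature patches, but the paste-the-HR-counterpart step is the same.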

Relevance: 30.00%

Publisher:

Abstract:

The reactivity of permethylzirconocene and permethylhafnocene complexes with various nucleophiles has been investigated. Permethylzirconocene reacts with sterically hindered ketenes and allenes to afford metallacycle products. Reaction of these cumulenes with permethylzirconocene hydride complexes affords enolate and σ-allyl species, respectively. Reactions which afford enolate products are nonstereospecific, whereas reactions which afford allyl products initially give a cis-σ-allyl complex which rearranges to its trans isomer. The mechanism of these reactions is proposed to proceed either by a Lewis acid-Lewis base interaction (ketenes) or by formation of a π-olefin intermediate (allenes).

Permethylzirconocene haloacyl complexes react with strong bases such as lithium diisopropylamide or methylene trimethylphosphorane to afford ketene compounds. Depending on the size of the alkyl ketene substituent, the hydrogenation of these compounds affords enolate-hydride products with varying degrees of stereoselectivity. The larger the substituent, the greater is the selectivity for cis hydrogenation products.

The reaction of permethylzirconocene dihydride and permethylhafnocene dihydride with methylene trimethylphosphorane affords methyl-hydride and dimethyl derivatives. Under appropriate conditions, the metallated-ylide complex 1, (η^5-C_5(CH_3)_5)_2 Zr(H)CH_2PMe_2CH_2, is also obtained and has been structurally characterized by X-ray diffraction techniques. Reaction of 1 with CO affords (η^5-C_5(CH_3)_5)_2 Zr(C,O-η^2-(PMe_3)HC=CO)H, which exists in solution as an equilibrium mixture of isomers. In one isomer (2), the η^2-acyl oxygen atom occupies a lateral equatorial coordination position about zirconium, whereas in the other isomer (3), the η^2-acyl oxygen atom occupies the central equatorial position. The equilibrium kinetics of the 2→3 isomerization have been studied and the structures of both complexes confirmed by X-ray diffraction methods. These studies suggest a mechanism for CO insertion into metal-carbon bonds of the early transition metals.

Permethylhafnocene dihydride and permethylzirconocene hydride complexes react with diazoalkanes to afford η^2-N,N'-hydrazonido species in which the terminal nitrogen atom of the diazoalkane molecule has inserted into a metal-hydride or metal-carbon bond. The structure of one of these compounds, Cp*_2Zr(NMeNCTol_2)OH, has been determined by X-ray diffraction techniques. Under appropriate conditions, the hydrazonido-hydride complexes react with a second equivalent of diazoalkane to afford η^1-N-hydrazonido-η^2-N,N'-hydrazonido species.

Relevance: 30.00%

Publisher:

Abstract:

This study addresses the problem of obtaining reliable velocities and displacements from accelerograms, a concern which often arises in earthquake engineering. A closed-form acceleration expression with random parameters is developed to test any strong-motion accelerogram processing method. Integration of this analytical time history yields the exact velocities, displacements and Fourier spectra. Noise and truncation can also be added. A two-step testing procedure is proposed and the original Volume II routine is used as an illustration. The main sources of error are identified and discussed. Although these errors may be reduced, it is impossible to extract the true time histories from an analog or digital accelerogram because of the uncertain noise level and missing data. Based on these uncertainties, a probabilistic approach is proposed as a new accelerogram processing method. A most probable record is presented as well as a reliability interval which reflects the level of error-uncertainty introduced by the recording and digitization process. The data is processed in the frequency domain, under assumptions governing either the initial value or the temporal mean of the time histories. This new processing approach is tested on synthetic records. It induces little error and the digitization noise is adequately bounded. Filtering is intended to be kept to a minimum and two optimal error-reduction methods are proposed. The "noise filters" reduce the noise level at each harmonic of the spectrum as a function of the signal-to-noise ratio. However, the correction at low frequencies is not sufficient to significantly reduce the drifts in the integrated time histories. The "spectral substitution method" uses optimization techniques to fit spectral models of near-field, far-field or structural motions to the amplitude spectrum of the measured data. 
The extremes of the spectrum of the recorded data where noise and error prevail are then partly altered, but not removed, and statistical criteria provide the choice of the appropriate cutoff frequencies. This correction method has been applied to existing strong-motion far-field, near-field and structural data with promising results. Since this correction method maintains the whole frequency range of the record, it should prove to be very useful in studying the long-period dynamics of local geology and structures.
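The frequency-domain step of such processing amounts to dividing each Fourier coefficient of the accelerogram by i·2πf, under a zero-mean record and zero-initial-value assumption. A minimal sketch of that integration step only (not the probabilistic method or the noise filters themselves), checked against a closed-form record:

```python
import numpy as np

def integrate_fft(acc, dt):
    """Integrate a zero-mean accelerogram in the frequency domain by dividing
    each Fourier coefficient by i*2*pi*f; the DC term is zeroed, which fixes
    the temporal mean of the integrated history."""
    n = len(acc)
    f = np.fft.fftfreq(n, dt)
    A = np.fft.fft(acc)
    V = np.zeros_like(A)
    nz = f != 0
    V[nz] = A[nz] / (2j * np.pi * f[nz])
    return np.fft.ifft(V).real

# Closed-form check: a = cos(2*pi*f0*t) integrates to v = sin(2*pi*f0*t) /
# (2*pi*f0), exactly, when f0 sits on an FFT bin.
n, dt, f0 = 256, 1.0 / 256, 4.0
t = np.arange(n) * dt
vel = integrate_fft(np.cos(2 * np.pi * f0 * t), dt)
```

This is exactly the kind of closed-form acceleration expression with known exact integrals that the proposed two-step testing procedure uses to exercise a processing routine.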

Relevance: 30.00%

Publisher:

Abstract:

Representatives of the family Lemnaceae (duckweed) make ideal experimental material for research into a range of phytophysiological processes concerning growth rate and vegetative reproduction, and they are also easy to maintain in sterile culture. Lemnaceae belong to the higher flowering plants (although flowers are rarely produced); however, they are distinguished by a much simplified morphological and anatomical structure. As water plants they have the advantage that they can be cultivated in synthetic media under laboratory conditions with controlled light intensity and temperature. This paper describes experimental research on the growth of Lemnaceae under different conditions; the variables studied included mineral media, illumination and aeration.

Relevance: 30.00%

Publisher:

Abstract:

Biomolecular circuit engineering is critical for implementing complex functions in vivo, and is a baseline method in the synthetic biology space. However, current methods for conducting biomolecular circuit engineering are time-consuming and tedious. A complete design-build-test cycle typically takes weeks to months due to the lack of an intermediary between design ex vivo and testing in vivo. In this work, we explore the development and application of a "biomolecular breadboard" composed of an in-vitro transcription-translation (TX-TL) lysate to rapidly speed up the engineering design-build-test cycle. We first developed protocols for creating and using lysates for conducting biological circuit design. By doing so we simplified the existing technology to an affordable ($0.03/uL) and easy-to-use three-tube reagent system. We then developed tools to accelerate circuit design by allowing for linear DNA use in lieu of plasmid DNA, and by utilizing principles of modular assembly. This allowed the design-build-test cycle to be reduced to under a business day. We then characterized protein degradation dynamics in the breadboard to aid in implementing complex circuits. Finally, we demonstrated that the breadboard could be applied to engineer complex synthetic circuits in vitro and in vivo. Specifically, we utilized our understanding of linear DNA prototyping, modular assembly, and protein degradation dynamics to characterize the repressilator oscillator and to prototype novel three- and five-node negative feedback oscillators both in vitro and in vivo. We therefore believe the biomolecular breadboard has wide application as an intermediary for biological circuit engineering.