987 results for MATRIX-ELEMENTS


Relevance: 20.00%

Abstract:

This thesis explores innovation as a means of achieving an enhanced level of sustainability in the Australian water sector. A modified Delphi study uncovered sixteen key elements, centred on the themes of 'community acceptance' and 'innovator effectiveness', that provide insights for immediate application within the sector to address the impacts of climate change, population growth and resource scarcity. This exploratory research builds a foundational understanding of the components of change and innovation within the Australian water sector, and forms the underpinning for more specific lines of enquiry.

Relevance: 20.00%

Abstract:

Matrix decompositions, where a given matrix is represented as a product of two other matrices, are regularly used in data mining. Most matrix decompositions have their roots in linear algebra, but the needs of data mining are not always those of linear algebra. In data mining one needs results that are interpretable -- and what is considered interpretable in data mining can be very different from what is considered interpretable in linear algebra.

The purpose of this thesis is to study matrix decompositions that directly address the issue of interpretability. An example is a decomposition of binary matrices where the factor matrices are assumed to be binary and the matrix multiplication is Boolean. The restriction to binary factor matrices increases interpretability -- the factor matrices are of the same type as the original matrix -- and allows the use of Boolean matrix multiplication, which is often more intuitive than normal matrix multiplication with binary matrices. Several other decomposition methods are also described, and the computational complexity of computing them is studied, together with the hardness of approximating the related optimization problems. Based on these studies, algorithms for constructing the decompositions are proposed. Constructing the decompositions turns out to be computationally hard, and the proposed algorithms are mostly based on various heuristics. Nevertheless, the algorithms are shown to be capable of finding good results in empirical experiments conducted with both synthetic and real-world data.
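
As a minimal sketch (my own illustration, not code from the thesis), the Boolean matrix product that the binary decompositions rely on can be written in a few lines of NumPy; the function name and the example matrices below are invented for demonstration.

```python
import numpy as np

# Boolean matrix product over the Boolean semiring:
# (B o C)[i, j] = OR_k (B[i, k] AND C[k, j]), so 1 + 1 = 1 instead of 2.

def boolean_product(B, C):
    """Boolean product of two 0/1 matrices."""
    return (B @ C > 0).astype(int)     # any nonzero count means the OR is 1

B = np.array([[1, 0],
              [1, 1],
              [0, 1]])                 # 3 x 2 binary factor
C = np.array([[1, 1, 0],
              [0, 1, 1]])              # 2 x 3 binary factor
A = boolean_product(B, C)
print(A)                               # [[1 1 0], [1 1 1], [0 1 1]]

# In Boolean matrix factorization one seeks factors B and C minimizing the
# number of disagreements with a given binary data matrix:
A_data = np.array([[1, 1, 0],
                   [1, 0, 1],
                   [0, 1, 1]])
print(np.sum(A != A_data))             # Hamming reconstruction error: 1
```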

Relevance: 20.00%

Abstract:

This thesis presents methods for locating and analyzing cis-regulatory DNA elements involved in the regulation of gene expression in multicellular organisms. The regulation of gene expression is carried out by the combined effort of several transcription factor proteins collectively binding the DNA at the cis-regulatory elements. Only sparse knowledge of the 'genetic code' of these elements exists today. An automatic tool for the discovery of putative cis-regulatory elements could help their experimental analysis, which would in turn yield a more detailed view of cis-regulatory element structure and function.

We have developed a computational model for the evolutionary conservation of cis-regulatory elements. The elements are modeled as evolutionarily conserved clusters of sequence-specific transcription factor binding sites. We give an efficient dynamic programming algorithm that locates the putative cis-regulatory elements and scores them according to the conservation model. A notable proportion of the high-scoring DNA sequences show transcriptional enhancer activity in transgenic mouse embryos. The conservation model includes four parameters whose optimal values are estimated with simulated annealing. With good parameter values the model discriminates well between DNA sequences with evolutionarily conserved cis-regulatory elements and DNA sequences that have evolved neutrally. On further inquiry, the set of highest-scoring putative cis-regulatory elements was found to be sensitive to small variations in the parameter values.

The statistical significance of the putative cis-regulatory elements is estimated with the Two Component Extreme Value Distribution, whose p-values grade the conservation of the cis-regulatory elements above the neutral expectation. The parameters of the distribution are estimated by simulating neutral DNA evolution.

The conservation of transcription factor binding sites can also be used in the upstream analysis of regulatory interactions. This approach may provide mechanistic insight into transcription-level data from, e.g., microarray experiments. Here we give a method to predict shared transcriptional regulators for a set of co-expressed genes.

The EEL (Enhancer Element Locator) software implements the method for locating putative cis-regulatory elements. The software supports both interactive use and distributed batch processing. We have used it to analyze the non-coding regions around all human genes with respect to the orthologous regions in various other species, including mouse. The data from these genome-wide analyses are stored in a relational database, which is used in the publicly available web services for upstream analysis and visualization of the putative cis-regulatory elements in the human genome.
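
The core idea of scoring conserved clusters of binding sites can be caricatured in a few lines. The sketch below is a deliberately crude stand-in for the thesis's dynamic program: the motif patterns, sequences, and window score are all invented, and a motif hit is simply called "conserved" if the motif also occurs somewhere in the orthologous sequence.

```python
import re

MOTIFS = {"GATA": "GATA", "ETS": "GGAA"}    # hypothetical binding-site patterns

def hits(seq, pattern):
    """Start positions of exact pattern matches."""
    return [m.start() for m in re.finditer(pattern, seq.upper())]

def best_cluster(human, mouse, window=20):
    """Max number of conserved-motif hits inside any `window`-base stretch."""
    conserved = [p for name, pat in MOTIFS.items()
                 if hits(mouse, pat)        # motif present in the ortholog
                 for p in hits(human, pat)]
    conserved.sort()
    best = 0
    for i, start in enumerate(conserved):
        count = sum(1 for q in conserved[i:] if q - start < window)
        best = max(best, count)
    return best

human = "TTGATAACGGAAGCTAGATAAC"             # made-up orthologous fragments
mouse = "CCGATAATGGAATT"
print(best_cluster(human, mouse))           # 3 conserved hits in one window
```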

Relevance: 20.00%

Abstract:

We propose a new type of high-order element that incorporates mesh-free Galerkin formulations into the framework of the finite element method. Traditional polynomial interpolation is replaced by mesh-free interpolation in the present high-order elements, and the strain smoothing technique is used to integrate the governing equations over smoothing cells. The properties of the high-order elements, which are influenced by the basis functions of the mesh-free interpolation and by the boundary nodes, are discussed through numerical examples. The basis functions are found to have a significant influence on the computational accuracy and on the upper and lower bounds of the energy norm, while the strain smoothing technique retains the softening phenomenon. The new high-order elements perform well when quadratic basis functions are used in the mesh-free interpolation, and they prove advantageous in adaptive mesh and node refinement schemes. Furthermore, they are less sensitive to element quality, because they use mesh-free interpolation and obey the Weakened Weak (W2) formulation introduced in [3, 5].
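
The strain smoothing idea itself is simple to state in one dimension. The following toy sketch is my own illustration, not the paper's formulation: by the divergence theorem, the cell-averaged strain over a smoothing cell [a, b] needs only the displacement on the cell boundary, eps_bar = (u(b) - u(a)) / (b - a), so no derivatives of the (possibly mesh-free) shape functions are evaluated inside the cell.

```python
def smoothed_strain(u, a, b):
    """Cell-averaged strain from boundary displacements only."""
    return (u(b) - u(a)) / (b - a)

u = lambda x: 0.1 * x + 0.05 * x ** 2        # displacement field, du/dx = 0.1 + 0.1x

for a, b in [(0.0, 0.5), (0.5, 1.0)]:        # two smoothing cells
    midpoint = 0.1 + 0.1 * (a + b) / 2       # pointwise strain at the cell centre
    print(f"cell [{a}, {b}]: smoothed = {smoothed_strain(u, a, b):.4f}, "
          f"midpoint = {midpoint:.4f}")      # identical for this quadratic field
```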

Relevance: 20.00%

Abstract:

We incorporated various gold nanoparticles (AuNPs) capped with different ligands into two-dimensional films and three-dimensional aggregates derived from N-stearoyl-L-alanine and N-lauroyl-L-alanine, respectively. The assemblies of N-stearoyl-L-alanine afforded stable films at the air-water interface, and more compact assemblies were formed upon incorporation of AuNPs into these films. We then examined the effects of incorporating AuNPs functionalized with different capping ligands into three-dimensional assemblies of N-lauroyl-L-alanine, a compound that formed a gel in hydrocarbons. The profound influence of nanoparticle incorporation on the physical gels was evident from evaluation of various microscopic and bulk properties. The interaction of the AuNPs with the gelator assembly was found to depend critically on the capping ligands protecting the gold surface. Transmission electron microscopy (TEM) showed a long-range directional assembly of certain AuNPs along the gel fibers. Scanning electron microscopy (SEM) images of the freeze-dried gels and nanocomposites indicate that the morphological transformation in the composite microstructures depends significantly on the capping agent of the nanoparticles. Differential scanning calorimetry (DSC) showed that gel formation from the sol occurred at a lower temperature upon incorporation of AuNPs whose capping ligands were able to align and noncovalently interact with the gel fibers. Rheological studies indicate that the gel-nanoparticle composites exhibit significantly greater viscoelasticity than the native gel alone when the capping ligands are able to interdigitate into the gelator assembly. Thus it was possible to define a clear relationship between the material and molecular-level properties by manipulating the information inscribed on the NP surface.

Relevance: 20.00%

Abstract:

An iterative algorithm based on probabilistic estimation is described for obtaining the minimum-norm solution of a very large, consistent linear system of equations Ax = g, where A is an (m × n) matrix with non-negative elements, and x and g are, respectively, (n × 1) and (m × 1) vectors with positive components.
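
The abstract does not spell out the estimation scheme, so the sketch below is only a related classical illustration, not the paper's algorithm: the Kaczmarz row-action method. Started from x = 0, its iterates stay in the row space of A, so on a consistent system they converge to the minimum-norm solution. The function name and test system are invented for demonstration.

```python
import numpy as np

def kaczmarz_min_norm(A, g, sweeps=500):
    m, n = A.shape
    x = np.zeros(n)                              # x = 0 lies in the row space
    for _ in range(sweeps):
        for i in range(m):
            a = A[i]
            x += (g[i] - a @ x) / (a @ a) * a    # project onto {x : a.x = g_i}
    return x

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])                  # consistent, underdetermined
g = A @ np.array([1.0, 1.0, 1.0])
x = kaczmarz_min_norm(A, g)
print(np.allclose(x, np.linalg.pinv(A) @ g))     # matches the min-norm solution
```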

Relevance: 20.00%

Abstract:

Purification of drinking water is routinely achieved by use of conventional coagulants and disinfection procedures. However, there are instances, such as flood events, when turbidity reaches extreme levels, while natural organic matter (NOM) may be an issue throughout the year. Consequently, there is a need for technologies that can effectively treat water of high turbidity during flood events and remove NOM year-round. Our hypothesis was that pebble matrix filtration offered a relatively cheap, simple and reliable means of clarifying such challenging water. A laboratory-scale pebble matrix filter (PMF) column was therefore used to evaluate turbidity and NOM pre-treatment performance on 2013 Brisbane River flood water. Since the high turbidity was only a seasonal and short-term problem, the general applicability of pebble matrix filters for NOM removal was also investigated. A 1.0 m deep bed of pebbles (the matrix), partly in-filled with either sand or crushed glass, was tested, upon which was situated a layer of granular activated carbon (GAC). Turbidity was measured as a surrogate for suspended solids (SS), whereas total organic carbon (TOC) and UV absorbance at 254 nm were measured as surrogate parameters for NOM.

Experiments using natural flood water showed that, without the addition of any chemical coagulants, PMF columns achieved at least 50% turbidity reduction when the source water contained moderate hardness levels; for harder water samples, above 85% turbidity reduction was obtained. The ability to remove 50% of turbidity without chemical coagulants may represent significant cost savings to water treatment plants, with added environmental benefits from reduced sludge formation. A TOC reduction of 35-47% and a UV-254 reduction of 24-38% were also observed. In addition to turbidity removal during flood periods, the ability of the pebble matrix filter to remove NOM throughout the year may reduce the disinfection by-product (DBP) formation potential and the coagulant demand at water treatment plants. Final head losses were remarkably low, reaching only 11 cm at a filtration velocity of 0.70 m/h.

Relevance: 20.00%

Abstract:

Poly(vinyl alcohol) (PVA) matrix composites reinforced with nanodiamond (ND) particles, with ND contents up to 0.6 wt%, were synthesized. Characterization of the composites by transmission electron microscopy (TEM) and small-angle X-ray scattering (SAXS) reveals a uniform distribution of the ND particles, with no agglomeration in the matrix. Differential scanning calorimetry reveals that the crystallinity of the polymer increases with increasing ND content, indicating a strong interaction between ND and PVA. The nano-indentation technique was employed to assess the mechanical properties of the composites. The results show that even small additions of ND lead to significant enhancement of the hardness and elastic modulus of PVA. Possible micromechanisms responsible for the enhancement of the mechanical properties are discussed.

Relevance: 20.00%

Abstract:

Mxr1p (methanol expression regulator 1) functions as a key regulator of methanol metabolism in the methylotrophic yeast Pichia pastoris. In this study, a recombinant Mxr1p protein containing the N-terminal zinc finger DNA-binding domain was overexpressed in and purified from E. coli cells, and its ability to bind to promoter sequences of AOXI, encoding alcohol oxidase, was examined. Mxr1p binds at six different regions of the AOXI promoter, and deletions encompassing these regions result in a significant decrease in AOXI promoter activity in vivo. Based on the analysis of AOXI promoter sequences, a consensus sequence for Mxr1p binding consisting of a core 5' CYCC 3' motif was identified. When the core CYCC sequence is mutated to CYCA, CYCT or CYCM (M = 5-methylcytosine), Mxr1p binding is abolished. Although Mxr1p is the homologue of the Saccharomyces cerevisiae transcription factor Adr1p, it does not bind to the Adr1p binding site of the S. cerevisiae alcohol dehydrogenase promoter (ADH2UAS1); however, two point mutations convert ADH2UAS1 into an Mxr1p binding site. The identification of the key DNA elements involved in promoter recognition by Mxr1p is an important step towards understanding its function as a master regulator of the methanol utilization pathway in P. pastoris.
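
For readers who want to play with the reported consensus, the core 5'-CYCC-3' motif (Y = C or T in IUPAC notation) is easy to scan for. The snippet below is a toy illustration: the example sequence is made up, not the AOXI promoter, and a full scan would also check the reverse complement.

```python
import re

CORE = re.compile(r"C[CT]CC")           # 5'-CYCC-3' core, Y = C or T

def find_core_motifs(seq):
    """Return (position, match) pairs for the CYCC core on the given strand."""
    return [(m.start(), m.group()) for m in CORE.finditer(seq.upper())]

promoter = "TTGACCTCCAAGTCCCCATG"       # hypothetical fragment
print(find_core_motifs(promoter))       # [(5, 'CTCC'), (13, 'CCCC')]
```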

Relevance: 20.00%

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods for analyzing and predicting volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation, which belongs to the field of financial econometrics.

The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and of different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling.

The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models. According to the empirical results, based on data for actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative. The distribution used to derive the generalization may also prove valuable in other areas of risk management.

The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional to decimal pricing, and it was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than those in earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the most actively traded stocks. The results help risk management and market mechanism design.
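
For context on the second essay, the baseline it generalizes is the standard ACD(1,1) model of Engle and Russell: x_i = psi_i * eps_i with psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}, where eps_i are i.i.d. positive with unit mean. The simulation below is a minimal sketch with illustrative parameter values, not the dissertation's generalized model.

```python
import numpy as np

def simulate_acd(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    """Simulate n trade durations from a standard ACD(1,1) model."""
    rng = np.random.default_rng(seed)
    psi = np.empty(n)                           # conditional expected durations
    x = np.empty(n)                             # realized durations
    psi[0] = omega / (1.0 - alpha - beta)       # unconditional mean duration
    x[0] = psi[0] * rng.exponential()           # unit-mean exponential innovation
    for i in range(1, n):
        psi[i] = omega + alpha * x[i - 1] + beta * psi[i - 1]
        x[i] = psi[i] * rng.exponential()
    return x, psi

durations, expected = simulate_acd(10_000)
print(durations.mean(), expected.mean())        # both near omega/(1-alpha-beta) = 1.0
```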

Relevance: 20.00%

Abstract:

Abstract is not available.

Relevance: 20.00%

Abstract:

Among the iterative schemes for computing the Moore–Penrose inverse of a well-conditioned matrix, only those with order of convergence two or three are computationally efficient. A Fortran programme implementing these schemes is provided.
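
The paper's Fortran programme is not reproduced here; the sketch below is a minimal NumPy rendering of the classical second-order (Newton-Schulz) iteration for the Moore-Penrose inverse, X_{k+1} = X_k (2I - A X_k), which converges quadratically when X_0 = alpha * A^T with 0 < alpha < 2 / sigma_max(A)^2. The third-order scheme X_{k+1} = X_k (3I - A X_k (3I - A X_k)) is analogous.

```python
import numpy as np

def moore_penrose_schulz(A, iters=50):
    """Second-order hyperpower iteration converging to pinv(A)."""
    m, _ = A.shape
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step: 1 / sigma_max^2
    X = alpha * A.T
    I = np.eye(m)
    for _ in range(iters):
        X = X @ (2 * I - A @ X)               # second-order update
    return X

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
print(np.allclose(moore_penrose_schulz(A), np.linalg.pinv(A)))   # True
```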

Relevance: 20.00%

Abstract:

"The juvenile sea squirt wanders through the sea searching for a suitable rock or hunk of coral to cling to and make its home for life. For this task it has a rudimentary nervous system. When it finds its spot and takes root, it doesn't need its brain any more so it eats it. It's rather like getting tenure." - Daniel C. Dennett, Consciousness Explained (1991)

The little sea squirt needs its brain for a task that is very simple and short. When the task is completed, the sea squirt starts a new life in a vegetative state, after having a nourishing meal. The little brain is more tightly structured than our massive primate brains: the number of neurons is exact, no leeway in neural proliferation is tolerated, each neuroblast migrates exactly to the correct position, and only a certain number of connections with the right companions is allowed. In comparison, the growth of a mammalian brain is a merry mess. The reason is obvious: the squirt brain needs to perform only a few predictable functions before becoming waste, whereas the more mobile and complex mammals engage their brains in tasks requiring quick adaptation and plasticity in a constantly changing environment.

Although the regulation of nervous system development varies between species, many regulatory elements remain the same. For example, all multicellular animals possess a collection of proteoglycans (PGs): proteins with attached, complex sugar chains called glycosaminoglycans (GAGs). In development, PGs participate in the organization of the animal body, for example in the construction of parts of the nervous system. PGs capture water with their GAG chains, forming a biochemically active gel at the surface of the cell and in the extracellular matrix (ECM). In the nervous system, this gel traps different molecules inside it: growth factors and ECM-associated proteins. These regulate the proliferation of neural stem cells (NSCs), guide the migration of neurons, and coordinate the formation of neuronal connections.

In this work I have followed the role of two molecules contributing to the complexity of mammalian brain development. N-syndecan is a transmembrane heparan sulfate proteoglycan (HSPG) with cell-signaling functions. Heparin-binding growth-associated molecule (HB-GAM) is an ECM-associated protein with high expression in the perinatal nervous system and high affinity for HS and heparin. N-syndecan is a receptor for several growth factors and for HB-GAM. HB-GAM induces specific signaling via N-syndecan, activating c-Src, calcium/calmodulin-dependent serine protein kinase (CASK) and cortactin. By studying gene knockouts of HB-GAM and N-syndecan in mice, I have found that HB-GAM and N-syndecan are involved as a receptor-ligand pair in neural migration and differentiation. HB-GAM competes with the growth factors fibroblast growth factor 2 (FGF-2) and heparin-binding epidermal growth factor (HB-EGF) for HS binding, causing NSCs to stop proliferating and to differentiate, and it affects HB-EGF-induced EGF receptor (EGFR) signaling in neural cells during migration. N-syndecan signaling affects the motility of young neurons by boosting EGFR-mediated cell migration. In addition, these two receptors form a complex at the surface of neurons, probably creating a motility-regulating structure.