86 results for System verification and analysis


Relevance:

100.00%

Publisher:

Abstract:

Complex networks have been studied extensively due to their relevance to many real-world systems such as the World Wide Web, the Internet, and biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are scale-free degree distribution, the small-world effect and self-similarity. The search for additional meaningful properties, and for the relationships among these properties, is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity. The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field that includes more complex weighted networks. The real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use protein-protein interaction (PPI) networks as a key example to show that their weighted networks inherit self-similarity from the original unweighted networks. First, we confirm that the random sequential box-covering algorithm is an effective tool to compute the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes are needed to cover the skeleton. We then adopt the iterative scoring method to generate weighted PPI networks of five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana.
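The random sequential box-covering idea is simple to state: repeatedly pick an uncovered node at random, grow a box of nodes that are pairwise within distance l_B - 1 of each other, and count the boxes needed to cover the whole network. The following is a minimal illustrative sketch (not the thesis code), using breadth-first search for shortest-path distances:

```python
import random
from collections import deque

def bfs_distances(adj, src):
    """Shortest-path lengths from src in an unweighted graph (adjacency dict)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def count_boxes(adj, l_B, seed=0):
    """Random sequential box covering: number of boxes of diameter < l_B
    needed to cover the whole network."""
    rng = random.Random(seed)
    dist = {u: bfs_distances(adj, u) for u in adj}
    uncovered = set(adj)
    n_boxes = 0
    while uncovered:
        center = rng.choice(sorted(uncovered))
        box = {center}
        for v in sorted(uncovered - {center}):
            # v may join only if it stays within l_B - 1 of every box member
            if all(dist[v].get(u, float("inf")) < l_B for u in box):
                box.add(v)
        uncovered -= box
        n_boxes += 1
    return n_boxes
```

The box-counting (fractal) dimension is then estimated from the slope of log N_B against log l_B over a range of box sizes, typically averaged over many random orderings.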
By using the random sequential box-covering algorithm, we calculate the fractal dimensions for both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in the generated weighted PPI networks. This implication will be useful for our treatment of the networks in the third part of the thesis. The second part of the thesis aims to explore the multifractal behaviour of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous, since these fractals consist of a geometrical figure which repeats on an ever-reduced scale, and fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary on different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterize the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalized fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and of a class of real networks, namely the PPI networks of different species. Our main finding is the existence of multifractality in scale-free networks and PPI networks, while the multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between microarrays of different genes. Our results confirm the existence of multifractality in gene interaction networks.
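For reference, the generalized fractal dimensions mentioned above are commonly defined through a box-level partition function; a standard textbook form (the symbols here are the conventional ones and may differ from the thesis notation) is

```latex
D_q = \lim_{\varepsilon \to 0} \frac{1}{q-1}\,
      \frac{\ln \sum_{i} p_i^{\,q}(\varepsilon)}{\ln \varepsilon},
      \qquad q \neq 1,
```

where p_i(ε) is the fraction of the measure (for a network, e.g. the fraction of nodes) contained in the i-th box of size ε. A monofractal has D_q constant in q, whereas a genuine multifractal has D_q varying with q.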
This multifractal analysis then provides a potentially useful tool for gene clustering and identification. The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterizing complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent works indicate that complex network theory can be a powerful tool for analysing time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of a complex network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes as vectors of a certain length in the time series, and the weight of the edge between any two nodes as the Euclidean distance between the corresponding vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, hence larger Hurst exponent, tend to have smaller fractal dimension, hence smoother sample paths. We then construct networks via the technique of the horizontal visibility graph (HVG), which has been widely used recently. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalized fractal dimensions of the HVG networks of fractional Brownian motions as well as those of binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest.
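The HVG construction is simple to state: two time points are linked exactly when every intermediate value lies strictly below both of them. A minimal self-contained sketch (illustrative only; the thesis applies this to fractional Brownian motion samples):

```python
def horizontal_visibility_graph(series):
    """Horizontal visibility graph: connect i and j (i < j) iff every
    intermediate value lies strictly below min(series[i], series[j])."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(series[k] < min(series[i], series[j])
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges
```

This O(n^2) double loop is enough for illustration; faster stack-based constructions exist for long series. The degree distribution of the resulting graph is what the resilience discussion below examines.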
As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: the visibility graph (VG) and the horizontal visibility graph (HVG). Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., it follows a power law), meaning that a large percentage of nodes must be destroyed before the network collapses into isolated parts; for HVG networks of fractional Brownian motions, in contrast, the degree distribution has exponential tails, implying that HVG networks would not survive the same kind of attack.

Relevance:

100.00%

Publisher:

Abstract:

Masonry is one of the most ancient construction materials in the world. Compared to other civil engineering practices, masonry construction is highly labour intensive, which can adversely affect quality and productivity. With a view to improving quality, and in light of the limited skilled labour available in recent times, several innovative masonry construction methods, such as dry stack and thin bed masonry, have been developed. This paper focuses on the thin bed masonry system, which is used in many parts of Europe. The thin bed masonry system utilises a thin layer of polymer-modified mortar connecting accurately dimensioned and/or interlockable units. This assembly process has the potential for an automated panelised construction system in an industry setting, or for adoption on site using less skilled labour, without sacrificing quality. This is because, unlike conventional masonry construction, the thin bed technology uses a thinner mortar (or glue) layer, which can be controlled easily through some novel methods described in this paper. Structurally, reduction in the thickness of the mortar joint has beneficial effects; for example, it increases the compressive strength of masonry, and the polymer-added glue mortar enhances lateral load capacity relative to conventional masonry. This paper reviews recent research outcomes on the structural characteristics and construction practices of thin bed masonry. Finally, the suitability of thin bed masonry in developing countries, where masonry remains the most common material for residential building construction, is discussed.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, a class of fractional advection–dispersion models (FADMs) is considered. This class includes five models: the time FADM, the mobile/immobile time FADM with a Caputo time-fractional derivative of order 0 < γ < 1, the space FADM with two-sided Riemann–Liouville derivatives, the time–space FADM, and the time fractional advection–diffusion-wave model with damping with index 1 < γ < 2. These equations can be used to simulate regional-scale anomalous dispersion with heavy tails. We propose computationally effective implicit numerical methods for these FADMs. The stability and convergence of the implicit numerical methods are analysed and compared systematically. Finally, numerical results are given to demonstrate the effectiveness of the theoretical analysis.
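As a concrete example of the class, the time FADM replaces the first-order time derivative in the classical advection–dispersion equation with a Caputo fractional derivative. In a representative one-dimensional form (v and D here are hypothetical constant velocity and dispersion coefficients; the paper's exact formulations may differ):

```latex
\frac{\partial^{\gamma} u(x,t)}{\partial t^{\gamma}}
  = -v\,\frac{\partial u(x,t)}{\partial x}
    + D\,\frac{\partial^{2} u(x,t)}{\partial x^{2}},
  \qquad 0 < \gamma < 1,
```

where the Caputo derivative of order 0 < γ < 1 is defined as

```latex
\frac{\partial^{\gamma} u(x,t)}{\partial t^{\gamma}}
  = \frac{1}{\Gamma(1-\gamma)} \int_{0}^{t}
    \frac{\partial u(x,s)}{\partial s}\,(t-s)^{-\gamma}\,\mathrm{d}s .
```

Setting γ = 1 recovers the classical advection–dispersion equation; γ < 1 produces the heavy-tailed, sub-diffusive behaviour mentioned above.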

Relevance:

100.00%

Publisher:

Abstract:

A Maintenance Test Section Survey (MTSS) was conducted as part of a Peer State Review of the Texas Maintenance Program held October 5–7, 2010. The purpose of the MTSS was to conduct a field review of 34 highway test sections and obtain participants' opinions about pavement, roadside, and maintenance conditions. The goal was to cross-reference, or benchmark, TxDOT's maintenance practices against the practices used by selected peer states. Representatives from six peer states (California, Georgia, Kansas, Missouri, North Carolina, and Washington) were invited to Austin to attend a 3-day Peer State Review of TxDOT Maintenance Practices Workshop and to participate in a field survey of a number of pre-selected one-mile roadway sections. It should be emphasized that the objective of the survey was not to evaluate and grade or score TxDOT's road network, but rather to determine whether the selected roadway sections met acceptable standards of service as perceived by the Directors of Maintenance or senior maintenance managers from the peer states...

Relevance:

100.00%

Publisher:

Abstract:

Mesenchymal stem cells (MSC) are emerging as a leading cellular therapy for a number of diseases. However, for such treatments to become available as a routine therapeutic option, efficient and cost-effective means for industrial manufacture of MSC are required. At present, clinical grade MSC are manufactured through a process of manual cell culture in specialized cGMP facilities. This process is open, extremely labor intensive, costly, and impractical for anything more than a small number of patients. While it has been shown that MSC can be cultivated in stirred bioreactor systems using microcarriers, providing a route to process scale-up, the degree of numerical expansion achieved has generally been limited. Furthermore, little attention has been given to the issue of primary cell isolation from complex tissues such as placenta. In this article we describe the initial development of a closed process for bulk isolation of MSC from human placenta, and subsequent cultivation on microcarriers in scalable single-use bioreactor systems. Based on our initial data, we estimate that a single placenta may be sufficient to produce over 7,000 doses of therapeutic MSC using a large-scale process.

Relevance:

100.00%

Publisher:

Abstract:

This work investigates the accuracy and efficiency tradeoffs between centralized and collective (distributed) algorithms for (i) sampling, and (ii) n-way data analysis techniques in multidimensional stream data, such as Internet chatroom communications. Its contributions are threefold. First, we use the Kolmogorov–Smirnov goodness-of-fit test to show that the statistical differences between real data obtained by collective sampling in the time dimension from multiple servers and data obtained from a single server are insignificant. Second, we show using the real data that collective analysis of 3-way data arrays (users × keywords × time), known as higher-order tensors, is more efficient than centralized algorithms with respect to both space and computational cost; furthermore, we show that this gain is obtained without loss of accuracy. Third, we examine the sensitivity of the collective construction and analysis of higher-order data tensors to the choice of servers and the sampling window size. We construct 4-way tensors (users × keywords × time × servers) and analyze them to show the impact of server and window-size selections on the results.
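The two-sample Kolmogorov–Smirnov statistic behind the first contribution is simply the largest gap between the two empirical CDFs. A minimal self-contained sketch (not the paper's code; sample data are hypothetical):

```python
from bisect import bisect_right

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for x in a + b:  # the supremum is attained at a sample point
        fa = bisect_right(a, x) / len(a)
        fb = bisect_right(b, x) / len(b)
        d = max(d, abs(fa - fb))
    return d
```

A statistic that stays below the critical value for the given sample sizes is what supports the finding that single-server and collectively sampled distributions are statistically indistinguishable.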

Relevance:

100.00%

Publisher:

Abstract:

Reasons for performing study: Many domestic horses and ponies are sedentary and obese due to confinement in small paddocks and stables and a diet of infrequent, high-energy rations. Severe health consequences can be associated with this altered lifestyle. Objectives: The aims of this study were to investigate the ability of horses to learn to use a dynamic feeder system and to determine the movement and behavioural responses of horses to the novel system. Methods: A dynamic feed station was developed to encourage horses to exercise in order to access ad libitum hay. Five pairs of horses (n = 10) were studied using a randomised crossover design, with each pair studied in a control paddock containing a standard hay feeder and in an experimental paddock containing the novel hay feeder. Horse movement was monitored by a global positioning system (GPS), and the horses were observed to assess their ability to learn to use the system and their behavioural responses to its use. Results: With initial human intervention, all horses used the novel feeder within 1 h. Some aggressive behaviour was observed between horses not well matched in dominance behaviour. The median distance walked by the horses during a 4 h period was less (P = 0.002) in the control paddock (117 [57–185] m) than in the experimental paddock (630 [509–719] m). Conclusions: The use of an automated feeding system promotes increased activity levels in horses housed in small paddocks, compared with a stationary feeder.

Relevance:

100.00%

Publisher:

Abstract:

Deterministic computer simulations of physical experiments are now common techniques in science and engineering. Often, physical experiments are too time-consuming, expensive or impossible to conduct, so complex computer models, or codes, are used instead; their study has given rise to the field of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, the question of how many computer experiment runs are needed, and how they should be augmented, is studied, with attention given to the case when the response is a function over time.
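A recurring first step in designing a computer experiment is spreading n runs evenly over d inputs; Latin hypercube designs are a standard answer, stratifying each input into n equal bins and sampling each bin exactly once. A minimal pure-Python sketch (illustrative only, not the thesis's designs):

```python
import random

def latin_hypercube(n_runs, n_dims, seed=0):
    """Latin hypercube design on [0,1)^d: each dimension is split into
    n_runs equal bins, and each bin is sampled exactly once."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one stratified, shuffled column per input dimension
        perm = list(range(n_runs))
        rng.shuffle(perm)
        columns.append([(p + rng.random()) / n_runs for p in perm])
    # transpose columns into a list of runs (rows)
    return [list(run) for run in zip(*columns)]
```

Each row is then scaled to the actual input ranges of the code and evaluated; augmenting such a design with further runs is one of the questions the thesis studies.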

Relevance:

100.00%

Publisher:

Abstract:

The latest paradigm shift in government, termed Transformational Government, puts the citizen at the centre of attention. Including citizens in the design of online one-stop portals can help governmental organisations become more customer-focussed. This study describes the initial efforts of an Australian state government to develop an information architecture for structuring the content of its future one-stop portal. To this end, card sorting exercises were conducted and analysed, utilising contemporary approaches found in academic and non-scientific literature. This paper describes the findings of the card sorting exercises in this particular case and discusses the suitability of the applied approaches in general; these are distinguished into non-statistical, statistical, and hybrid approaches. On the one hand, this paper contributes to academia by describing the application of different card sorting approaches and discussing their strengths and weaknesses. On the other hand, it contributes to practice by explaining the approach taken by the authors' research partner to develop a customer-focussed governmental one-stop portal, thereby providing decision support for practitioners with regard to the different analysis methods that can be used to complement recent approaches in Transformational Government.
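Statistical card-sort analyses typically begin from a card-by-card similarity matrix: the fraction of participants who placed each pair of cards in the same group, which is then fed into clustering to suggest the information architecture. A minimal sketch of that first step (the card names and sorts below are hypothetical, not the study's data):

```python
def cooccurrence(sorts, cards):
    """Similarity matrix for card sorting: fraction of participants who
    placed each pair of cards in the same group."""
    n = len(cards)
    idx = {c: i for i, c in enumerate(cards)}
    M = [[0.0] * n for _ in range(n)]
    for groups in sorts:          # one participant's sort: a list of groups
        for group in groups:
            for a in group:
                for b in group:
                    M[idx[a]][idx[b]] += 1
    for i in range(n):
        for j in range(n):
            M[i][j] /= len(sorts)
    return M
```

Hierarchical clustering of (1 - M) as a distance matrix is one common way to derive candidate content groupings for the portal.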

Relevance:

100.00%

Publisher:

Abstract:

Background: Nicotiana benthamiana has been widely used for transient gene expression assays and as a model plant in the study of plant-microbe interactions, lipid engineering and RNA silencing pathways. Assembling the sequence of its transcriptome provides information that, in conjunction with the genome sequence, will facilitate insight into the plant's capacity for high-level transient transgene expression, generation of mobile gene silencing signals, and hyper-susceptibility to viral infection. Methodology/Results: RNA-seq libraries from 9 different tissues were deep sequenced and assembled, de novo, into a representation of the transcriptome. The assembly, comprising 16 GB of sequence, yielded 237,340 contigs, clustering into 119,014 transcripts (unigenes). Between 80 and 85% of reads from all tissues could be mapped back to the full transcriptome. Approximately 63% of the unigenes exhibited a match to the Solgenomics tomato predicted proteins database. Approximately 94% of the Solgenomics N. benthamiana unigene set (16,024 sequences) matched our unigene set (119,014 sequences). Using homology searches, we identified 31 homologues of genes involved in RNAi-associated pathways in Arabidopsis thaliana, and show that they possess the domains characteristic of these proteins. Of these genes, the RNA-dependent RNA polymerase gene, Rdr1, is transcribed but has a 72 nt insertion in exon 1 that would cause premature termination of translation. Dicer-like 3 (DCL3) appears to lack both the DEAD helicase motif and the second dsRNA-binding motif, and DCL2 and AGO4b have unexpectedly high levels of transcription. Conclusions: The assembled and annotated representation of the transcriptome and the list of RNAi-associated sequences are accessible at www.benthgenome.com alongside a draft genome assembly. These genomic resources will be very useful for further study of the developmental, metabolic and defense pathways of N. benthamiana, and for understanding the mechanisms behind the features which have made it such a well-used model plant. © 2013 Nakasugi et al.

Relevance:

100.00%

Publisher:

Abstract:

The internationalisation process of firms has attracted much research interest since the 1970s. It is noted, however, that a significant research gap exists in studies with a primary focus on the pre-internationalisation behaviour of firms. This paper proposes the incorporation of a pre-internationalisation phase into the traditional Uppsala model of firm internationalisation to address the issue of export readiness. Through an extensive literature review, the concepts fundamental to the ability of an Uppsala-type firm to begin internationalisation through an export entry mode are identified: exposure to stimuli factors, the attitudinal commitment of decision makers towards exporting, the firm's resource capabilities, and the moderating effect of lateral rigidity. The concept of export readiness is operationalised in this study through the construction of an export readiness index (ERI) using exploratory and confirmatory factor analysis. The index is then applied to some representative cases and tested using logistic regression to establish its validity as a diagnostic tool. The proposed ERI not only presents a more practical approach towards analysing firms' export readiness but also has major public policy implications as a possible tool for government export promotion agencies.
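The two modelling steps, a weighted composite index followed by a logistic validation, can be sketched in a few lines. The weights, intercept and slope below are hypothetical placeholders, not the fitted values from the study:

```python
import math

def export_readiness_index(factor_scores, weights):
    """Composite ERI: weighted sum of factor scores (e.g. stimuli exposure,
    attitudinal commitment, resources, lateral rigidity)."""
    return sum(w * s for w, s in zip(weights, factor_scores))

def export_probability(eri, intercept=0.0, slope=1.0):
    """Logistic-regression link: probability that a firm with this ERI
    is export-ready, given fitted intercept and slope."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * eri)))
```

In the study's setting, the factor scores come from the factor analysis and the intercept and slope from fitting the logistic regression to exporting versus non-exporting firms; the index is diagnostic when the fitted model separates the two groups well.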

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a comprehensive numerical procedure for treating the blast response of laminated glass (LG) panels and studies the influence of important material parameters. The post-crack behaviour of the LG panel and the contribution of the interlayer towards blast resistance are treated. The modelling techniques are validated by comparison with existing experimental results. Findings indicate that the tensile strength of glass considerably influences the blast response of LG panels, while the interlayer material properties have a major impact on the response under higher blast loads. Initially, the glass panes absorb most of the blast energy; after the glass breaks, the interlayer deforms further and absorbs most of the blast energy. LG panels should be designed to fail by tearing of the interlayer rather than by failure at the supports in order to achieve a desired level of protection. From this aspect, the material properties of the glass, interlayer and sealant joints play important roles, but unfortunately they are not accounted for in the current design standards. The new information generated in this paper will enhance the capabilities of engineers to better design LG panels under blast loads and to use better materials to improve their blast response.