Abstract:
Complex networks have been studied extensively due to their relevance to many real-world systems such as the world-wide web, the internet, and biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are the scale-free degree distribution, the small-world effect and self-similarity. The search for additional meaningful properties and for the relationships among these properties is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity. The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field including more complex weighted networks. Those real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use protein-protein interaction (PPI) networks as a key example to show that the weighted networks inherit the self-similarity of the original unweighted networks. Firstly, we confirm that the random sequential box-covering algorithm is an effective tool to compute the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes are needed to cover the skeleton. Then we adopt the iterative scoring method to generate weighted PPI networks for five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana. By using the random sequential box-covering algorithm, we calculate the fractal dimensions of both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in the generated weighted PPI networks. This finding will be useful for our treatment of the networks in the third part of the thesis. The second part of the thesis aims to explore the multifractal behaviour of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous, since these fractals consist of a geometrical figure which repeats on an ever-reduced scale. Fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary on different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterise the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalised fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and of a class of real networks, namely the PPI networks of different species.
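To make the random sequential box-covering procedure concrete, the following Python sketch (a minimal illustration using networkx with a placeholder graph, not the thesis's implementation) covers a network with boxes of radius r_B and estimates the box-counting dimension d_B from the scaling N_B ∝ l_B^(-d_B), where l_B = 2*r_B + 1:

```python
import random
import numpy as np
import networkx as nx

def random_sequential_box_covering(G, radius):
    """Cover G by repeatedly choosing a random uncovered node as a box
    centre and assigning every uncovered node within `radius` hops of it
    to that box; return the number of boxes used."""
    uncovered = set(G.nodes())
    n_boxes = 0
    while uncovered:
        centre = random.choice(tuple(uncovered))
        ball = nx.single_source_shortest_path_length(G, centre, cutoff=radius)
        uncovered.difference_update(ball)
        n_boxes += 1
    return n_boxes

# Estimate d_B from the slope of log N_B against log l_B (l_B = 2*radius + 1).
G = nx.barabasi_albert_graph(2000, 2)   # placeholder graph, not a real PPI network
radii = [1, 2, 3, 4]
l_B = np.array([2 * r + 1 for r in radii])
N_B = np.array([random_sequential_box_covering(G, r) for r in radii])
slope, _ = np.polyfit(np.log(l_B), np.log(N_B), 1)
print(f"estimated box-counting dimension d_B ~ {-slope:.2f}")
```

Because the box centres are chosen at random, the box count fluctuates between runs; in practice one averages N_B over many covering realisations before fitting the slope.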
Our main finding is the existence of multifractality in scale-free networks and PPI networks, while multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between microarrays of different genes. Our results confirm the existence of multifractality in gene interaction networks. This multifractal analysis thus provides a potentially useful tool for gene clustering and identification. The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterising complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent works indicate that complex network theory can be a powerful tool for analysing time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of a complex network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes as vectors of a certain length extracted from the time series, and weight the edge between any two nodes by the Euclidean distance between the corresponding vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, hence a larger Hurst exponent, tend to have a smaller fractal dimension and hence smoother sample paths. We then construct networks via the technique of the horizontal visibility graph (HVG), which has been widely used recently. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalised fractal dimensions of the HVG networks of fractional Brownian motions as well as those of binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest. As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: the visibility graph (VG) and the horizontal visibility graph. Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., follows a power law), meaning that one needs to destroy a large percentage of nodes before the network collapses into isolated parts, while for HVG networks of fractional Brownian motions the degree distribution has exponential tails, implying that HVG networks would not survive the same kind of attack.
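The horizontal visibility graph used above has a simple standard construction: time points become nodes, and two points i < j are linked whenever every intermediate value lies strictly below both of them. A minimal sketch of this generic HVG algorithm follows (a plain random walk stands in for fractional Brownian motion; this is not the thesis's code):

```python
import numpy as np
import networkx as nx

def horizontal_visibility_graph(series):
    """Build the HVG of a 1-D time series: nodes are time indices, and
    i < j are linked iff series[k] < min(series[i], series[j])
    for every k with i < k < j."""
    G = nx.Graph()
    n = len(series)
    G.add_nodes_from(range(n))
    for i in range(n - 1):
        G.add_edge(i, i + 1)            # neighbours always see each other
        blocker = series[i + 1]         # running maximum of intermediates
        for j in range(i + 2, n):
            if blocker < min(series[i], series[j]):
                G.add_edge(i, j)
            blocker = max(blocker, series[j])
            if blocker >= series[i]:    # nothing further right can be visible
                break
    return G

# Example: HVG of a random walk, a crude stand-in for fractional
# Brownian motion with Hurst exponent H = 0.5.
x = np.cumsum(np.random.randn(500))
G = horizontal_visibility_graph(x)
print(G.number_of_nodes(), G.number_of_edges())
```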
Abstract:
Much has been said and documented about the key role that reflection can play in the ongoing development of e-portfolios, particularly e-portfolios utilised for teaching and learning. A review of e-portfolio platforms reveals that a designated space for documenting and collating personal reflections is a typical design feature of both open source and commercial off-the-shelf software. Further investigation of tools within e-portfolio systems for facilitating reflection reveals that, apart from enabling personal journaling through blogs or other writing, scaffolding tools that encourage the actual process of reflection are under-developed. Investigation of a number of prominent e-portfolio projects also reveals that reflection, while presented as critically important, is often viewed as an activity that takes place after a learning activity or experience rather than being intrinsic to it. This paper assumes an alternative, richer conception of reflection: a process integral to a wide range of activities associated with learning, such as inquiry, communication, editing, analysis and evaluation. Such a conception is consistent with the literature associated with ‘communities of practice’, which is replete with insight into ‘learning through doing’, and with a ‘whole-minded’ approach to inquiry. Thus, graduates who are ‘reflective practitioners’, integrating reflection into their learning, will have more to offer a prospective employer than graduates who have adopted an episodic approach to reflection. So, what kinds of tools might facilitate integrated reflection? This paper outlines a number of possibilities for consideration and development. Such tools do not have to be embedded within e-portfolio systems, although there are benefits in doing so. In order to inform the future design of e-portfolio systems, this paper presents a faceted model of knowledge creation that depicts an ‘ecology of knowing’ in which interaction with, and the production of, learning content is deepened through the construction of well-formed questions of that content. In particular, questions that are initiated by ‘why’ are explored, because they are distinguished from the other ‘journalist’ questions (who, what, when, where, and how) in that answers to them demand explanative, as opposed to descriptive, content: they require a rationale. Although why-questions do not belong to any one genre and are not simple to classify (responses can contain motivational, conditional, causal, and/or existential content), they do make a difference in the acquisition of understanding. The development of scaffolding that builds on why-questioning to enrich learning is the motivation behind the research that has informed this paper.
Abstract:
Sound Thinking provides techniques and approaches for listening, thinking, talking and writing critically about music you hear or make. It offers tips on making music, and it encourages regular and deep thinking about music activities, which helps build a musical dialogue that leads to deeper understanding.
Abstract:
Post-deployment maintenance and evolution can account for up to 75% of the cost of developing a software system. Software refactoring can reduce the costs associated with evolution by improving system quality. Although refactoring can yield benefits, the process includes potentially complex, error-prone, tedious and time-consuming tasks. It is these tasks that automated refactoring tools seek to address. However, although the refactoring process is well-defined, current refactoring tools do not support the full process. To develop better automated refactoring support, we have completed a usability study of software refactoring tools. In the study, we analysed the task of software refactoring using the ISO 9241-11 usability standard and Fitts' List of task allocation. Expanding on this analysis, we reviewed 11 collections of usability guidelines and combined these into a single list of 38 guidelines. From this list, we developed 81 usability requirements for refactoring tools. Using these requirements, the software refactoring tools Eclipse 3.2, Condenser 1.05, RefactorIT 2.5.1, and Eclipse 3.2 with the Simian UI 2.2.12 plugin were studied. Based on the analysis, we have selected a subset of the requirements that can be incorporated into a prototype refactoring tool intended to address the full refactoring process.
Abstract:
During the course of several natural disasters in recent years, Twitter has been found to play an important role as an additional medium for many-to-many crisis communication. Emergency services are successfully using Twitter to inform the public about current developments, and are increasingly also attempting to source first-hand situational information from Twitter feeds (such as relevant hashtags). However, the further study of the uses of Twitter during natural disasters relies on the development of flexible and reliable research infrastructure for tracking and analysing Twitter feeds at scale and in close to real time. This article outlines two approaches to the development of such infrastructure: one which builds on the readily available open source platform yourTwapperkeeper to provide a low-cost, simple, and basic solution; and one which establishes a more powerful and flexible framework by drawing on highly scalable, state-of-the-art technology.
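For a sense of how basic such tracking infrastructure can be, the sketch below archives tweets matching a hashtag in close to real time. It assumes the tweepy 3.x client and Twitter's since-retired free Streaming API; the credentials, hashtag, and file name are placeholders, and this is not the yourTwapperkeeper codebase:

```python
import json
import tweepy

# Placeholder credentials from a (historical) Twitter developer account.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")

class HashtagArchiver(tweepy.StreamListener):
    """Append every matching tweet, as raw JSON, to a local archive file."""

    def on_status(self, status):
        with open("crisis_tweets.jsonl", "a") as f:
            f.write(json.dumps(status._json) + "\n")

    def on_error(self, status_code):
        # Returning False on HTTP 420 disconnects instead of hammering the API.
        return status_code != 420

# Track a disaster-related hashtag in close to real time.
stream = tweepy.Stream(auth=auth, listener=HashtagArchiver())
stream.filter(track=["#exampleflood"])
```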
Abstract:
Fundamental tooling is required in order to apply USDL in practical settings. This chapter discusses three fundamental types of tools for USDL. First, USDL editors have been developed for expert and casual users, respectively. Second, several USDL repositories have been built to allow editors to access and store USDL descriptions. Third, our generic USDL marketplace allows providers to describe their services once and potentially trade them anywhere; in addition, it addresses the idiosyncrasies of service trading as opposed to the simpler case of product trading. The chapter also presents several deployment scenarios for such tools to foster individual value chains and support new business models across organizational boundaries. We close the chapter with an application of USDL in the context of service engineering.
Abstract:
The study of urban morphology has become an expanding field of research within the architectural discipline, providing theories to be used as tools in the understanding and design of urban landscapes of the past and present, and into the future. Drawing upon contemporary architectural design theory, this investigation reveals what a sectional analysis of an urban landscape can add to the existing research methods within this field. This paper conducts an enquiry into the use of the section as a tool for urban morphological analysis. Following the methodology of the British school of urban morphology, sections through the urban fabric of the case-study city of Brisbane are compared. The results are categorised to depict changes in scale, components and utilisation across various timeframes. The key findings illustrate how the section, when read in conjunction with the plan, can be used to interpret changes to urban form and the relationship these changes have to the quality of the urban environment in the contemporary city.
Abstract:
The concept of Six Sigma was initiated in the 1980s by Motorola. Since then it has been implemented in several manufacturing and service organizations. Until now, Six Sigma implementation in the private sector has mostly been limited to healthcare and financial services. Its implementation is now gradually picking up in services such as call centers, education, and construction and related engineering, in both the private and public sectors. Through a literature review, a questionnaire survey and a multiple-case-study approach, the paper develops a conceptual framework to facilitate widening the scope of Six Sigma implementation in service organizations. Using grounded theory methodology, this study develops theory for Six Sigma implementation in service organizations. The study involves a questionnaire survey and case studies to understand the issues and build a conceptual framework. The survey, exploratory in nature, was conducted in service organizations in Singapore. The case studies involved three service organizations which implemented Six Sigma. The objective is to explore and understand the issues highlighted by the survey and the literature. The findings confirm the inclusion of critical success factors, critical-to-quality characteristics, and a set of tools and techniques, as observed in the literature. In the case of key performance indicators, there are different interpretations in the literature and among industry practitioners: some sources describe key performance indicators as performance metrics, whereas others regard them as key process input or output variables, which mirrors the interpretations held by practitioners of Six Sigma. The responses of 'not relevant' and 'unknown to us' as reasons for not implementing Six Sigma show the need to understand the specific requirements of service organizations. Though much theoretical description of Six Sigma is available, there has been limited rigorous academic research on it. This gap is far more pronounced for Six Sigma implementation in service organizations, where the theory is not yet mature. Identifying this need, the study contributes by undertaking a theory-building exercise and developing a conceptual framework to understand the issues involved in its implementation in service organizations.
Abstract:
The security of power transfer across a given transmission link is typically a steady-state assessment. This paper develops tools to assess machine angle stability as affected by a combination of faults and the uncertainty of wind power, using probability analysis. The paper elaborates on the development of the theoretical assessment tool and demonstrates its efficacy on a single-machine infinite-bus system.
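As a toy illustration of this kind of probabilistic assessment (a hedged sketch with invented parameters, not the paper's tool), one can integrate the classical swing equation for a single-machine infinite-bus system through a fault over many random draws of wind-dependent mechanical power and record how often the rotor angle diverges:

```python
import numpy as np

# Classical SMIB swing model: M*delta'' = Pm - Pmax*sin(delta) - D*delta'
# All parameter values are invented, in per-unit / seconds.
M, D = 0.1, 0.05                                      # inertia, damping
P_PRE, P_FAULT, P_POST = 1.8, 0.4, 1.6                # transfer limits
T_FAULT, T_CLEAR, T_END, DT = 1.0, 1.15, 5.0, 0.005   # fault timing

def is_stable(pm):
    """Euler-integrate the swing equation through a fault; call the case
    unstable if the rotor angle ever passes pi (pole slip)."""
    delta = np.arcsin(pm / P_PRE)   # pre-fault equilibrium angle
    omega = 0.0
    for t in np.arange(0.0, T_END, DT):
        if t < T_FAULT:
            pmax = P_PRE
        elif t < T_CLEAR:
            pmax = P_FAULT
        else:
            pmax = P_POST
        acc = (pm - pmax * np.sin(delta) - D * omega) / M
        omega += acc * DT
        delta += omega * DT
        if delta > np.pi:
            return False
    return True

# Monte Carlo over uncertain wind output feeding the machine.
rng = np.random.default_rng(0)
pm_samples = 0.6 + 0.6 * rng.beta(2.0, 2.0, size=1000)  # invented distribution
p_unstable = np.mean([not is_stable(pm) for pm in pm_samples])
print(f"estimated probability of angle instability: {p_unstable:.3f}")
```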
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research projects and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, as the gathered information comes from the crowd, its quality is hard to manage. There are many ways to manage data quality, and reputation management is one of the common approaches. In recent years, many research teams have deployed audio or image sensors in natural environments in order to monitor the status of animals or plants, with the collected data analysed by ecologists. However, as the volume of collected data is extremely large and the number of ecologists is very limited, it is impossible for scientists to analyse all these data manually. The functions of existing automated tools for processing the data are still very limited, and the results are still not very accurate. Therefore, researchers have turned to recruiting members of the general public who are interested in helping scientific research to perform pre-processing tasks such as species tagging. Although research teams can save time and money by recruiting citizens who volunteer their time and skills to help with data analysis, the reliability of contributed data varies considerably. This research therefore aims to investigate techniques to enhance the reliability of data contributed by general citizens in scientific research projects, especially acoustic sensing projects. In particular, we aim to investigate how reputation management can be used to enhance data reliability. Reputation systems have been used to resolve uncertainty and improve data quality in many marketing and E-Commerce domains, and the commercial organizations that have embraced reputation management and implemented the technology have gained many benefits. Data quality issues are significant in the domain of Citizen Science due to the quantity and diversity of the people and devices involved, yet research on reputation management in this area is relatively new. We therefore start our investigation by examining existing reputation systems in different domains. We then design novel reputation management approaches for Citizen Science projects to categorise participants and data. We have investigated several critical elements which may influence data reliability in Citizen Science projects, including personal information such as location and education, and performance information such as the ability to recognise certain bird calls. The designed reputation framework is evaluated through a series of experiments involving many participants in collecting and interpreting data, in particular environmental acoustic data. Our research in exploring the advantages of reputation management in Citizen Science (or crowdsourcing in general) will help increase awareness among organizations that are unacquainted with its potential benefits.
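One common building block for such a framework, shown here purely as an illustrative sketch rather than the thesis's design, is the beta reputation score of Jøsang and Ismail: a participant's expected accuracy under a Beta posterior over their expert-verified correct and incorrect annotations, which can then weight their tags during aggregation. All names and numbers below are invented:

```python
from dataclasses import dataclass

@dataclass
class Participant:
    """Expert-verified tagging record for one citizen scientist."""
    name: str
    correct: int = 0
    incorrect: int = 0

    def reputation(self) -> float:
        # Beta reputation: expected accuracy under a Beta(correct+1,
        # incorrect+1) posterior; new participants start at 0.5.
        return (self.correct + 1) / (self.correct + self.incorrect + 2)

def weighted_vote(tags):
    """Pick the species tag for one audio clip, weighting each vote by
    the tagger's current reputation."""
    scores = {}
    for participant, species in tags:
        scores[species] = scores.get(species, 0.0) + participant.reputation()
    return max(scores, key=scores.get)

alice = Participant("alice", correct=40, incorrect=5)    # experienced tagger
bob = Participant("bob", correct=3, incorrect=9)         # unreliable tagger
print(f"{alice.reputation():.2f} {bob.reputation():.2f}")  # 0.87 0.29
tags = [(alice, "Lewin's honeyeater"), (bob, "noisy miner")]
print(weighted_vote(tags))   # the higher-reputation tag wins
```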
Abstract:
The use of the Trusted Platform Module (TPM) is becoming increasingly popular in many security systems. To access objects protected by the TPM (such as cryptographic keys), several cryptographic protocols, such as the Object Specific Authorization Protocol (OSAP), can be used. Given the sensitivity and importance of the objects protected by the TPM, the security of this protocol is vital. Formal methods allow a precise and complete analysis of cryptographic protocols, such that their security properties can be asserted with high assurance. Unfortunately, formal verification of these protocols is limited, despite the abundance of formal tools that one can use. In this paper, we demonstrate the use of Coloured Petri Nets (CPN), a type of formal technique, to formally model the OSAP. Using this model, we then verify the authentication property of this protocol using the state space analysis technique. The results of the analysis demonstrate that, as reported by Chen and Ryan, the authentication property of OSAP can be violated.
Abstract:
The state of the practice in safety has advanced rapidly in recent years with the emergence of new tools and processes for improving the selection of the most cost-effective safety countermeasures. However, many challenges prevent fair and objective comparisons of countermeasures applied across safety disciplines (e.g. engineering, emergency services, and behavioral measures). These countermeasures operate at different spatial scales, are often funded by different financial sources and agencies, and have associated costs and benefits that are difficult to estimate. This research proposes a methodology by which both behavioral and engineering safety investments are considered and compared in a specific local context. The methodology involves a multi-stage process that enables the analyst to select countermeasures that yield high benefits relative to costs, are targeted to a particular project, and may involve costs and benefits that accrue over varying spatial and temporal scales. The methodology is illustrated using a case study from the Geary Boulevard Corridor in San Francisco, California. The case study illustrates that: 1) the methodology enables the identification and assessment of a wide range of safety investment types at the project level; 2) the nature of crash histories lends itself to the selection of both behavioral and engineering investments, requiring cooperation across agencies; and 3) the results of the cost-benefit analysis are highly sensitive to cost and benefit assumptions, so all assumptions must be listed and justified. It is recommended that a sensitivity analysis be conducted when there is large uncertainty surrounding cost and benefit assumptions.
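For example, comparing an engineering countermeasure and a behavioural countermeasure whose costs and benefits accrue on different time horizons reduces to a discounted benefit-cost calculation of the following kind (a generic sketch with invented figures, not the paper's case-study data):

```python
def npv(cash_flows, rate):
    """Net present value of year-indexed cash flows (year 0 undiscounted)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

def benefit_cost_ratio(benefits, costs, rate=0.04):
    return npv(benefits, rate) / npv(costs, rate)

# Invented example, in $1000s: a signal-retiming project (one-off cost,
# long-lived benefits) versus an enforcement campaign (recurring cost,
# benefits that fade as behaviour reverts).
engineering = benefit_cost_ratio(benefits=[0] + [120] * 10, costs=[450])
behavioural = benefit_cost_ratio(benefits=[0, 90, 60, 30], costs=[50] * 4)
print(f"engineering BCR: {engineering:.2f}, behavioural BCR: {behavioural:.2f}")
```

Changing the discount rate or the assumed benefit streams can reverse the ranking, which is the sensitivity the abstract warns about.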
Abstract:
Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user-friendly supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that are contradictory to this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
This paper demonstrates, following Vygotsky, that language and tool use have a critical role in the collaborative problem-solving behaviour of school-age children. It reports original ethnographic classroom research examining the convergence of speech and practical activity in children’s collaborative problem solving with robotics programming tasks. The researchers analysed children’s interactions during a series of problem-solving experiments in which Lego Mindstorms toolsets were used by teachers to create robotics design challenges for 24 students in a Year 4 Australian classroom (students aged 8.5–9.5 years). The design challenges were incrementally difficult, beginning with basic programming of straight-line movement and progressing to more complex challenges in which the robots were programmed to raise Lego figures from conduit pipes, acting as pulleys with string and recycled materials. Data collection involved micro-genetic analysis of students’ speech interactions with tools, peers and other experts, together with teacher interviews and student focus group data. Coding the repeated patterns in the transcripts, the authors outline the structure of the children’s social speech in joint problem solving, demonstrating the patterns of speech and interaction that play an important role in the socialisation of the school-age child’s practical intellect.