110 results for World wide web (www)
Abstract:
Complex networks have been studied extensively owing to their relevance to many real-world systems, such as the World Wide Web, the Internet, and biological and social systems. During the past two decades, studies of such networks in different fields have produced many significant results concerning their structures, topological properties, and dynamics. Three well-known properties of complex networks are the scale-free degree distribution, the small-world effect and self-similarity. The search for additional meaningful properties and for the relationships among these properties is an active area of current research. This thesis investigates a newer aspect of complex networks, namely their multifractality, which is an extension of the concept of self-similarity.

The first part of the thesis aims to confirm that the study of properties of complex networks can be expanded to a wider field that includes more complex weighted networks. The real networks that have been shown to possess the self-similarity property in the existing literature are all unweighted networks. We use protein-protein interaction (PPI) networks as a key example to show that their weighted networks inherit the self-similarity of the original unweighted networks. Firstly, we confirm that the random sequential box-covering algorithm is an effective tool to compute the fractal dimension of complex networks. This is demonstrated on the Homo sapiens and E. coli PPI networks as well as their skeletons. Our results verify that the fractal dimension of the skeleton is smaller than that of the original network because the shortest distance between nodes is larger in the skeleton; hence, for a fixed box size, more boxes are needed to cover the skeleton. Then we adopt the iterative scoring method to generate weighted PPI networks of five species, namely Homo sapiens, E. coli, yeast, C. elegans and Arabidopsis thaliana. Using the random sequential box-covering algorithm, we calculate the fractal dimensions of both the original unweighted PPI networks and the generated weighted networks. The results show that self-similarity is still present in the generated weighted PPI networks. This finding will be useful for our treatment of the networks in the third part of the thesis.

The second part of the thesis aims to explore the multifractal behaviour of different complex networks. Fractals such as the Cantor set, the Koch curve and the Sierpinski gasket are homogeneous, since they consist of a geometrical figure that repeats on an ever-reduced scale. Fractal analysis is a useful method for their study. However, real-world fractals are not homogeneous; there is rarely an identical motif repeated on all scales. Their singularity may vary over different subsets, implying that these objects are multifractal. Multifractal analysis is a useful way to systematically characterise the spatial heterogeneity of both theoretical and experimental fractal patterns. However, the tools for multifractal analysis of objects in Euclidean space are not suitable for complex networks. In this thesis, we propose a new box-covering algorithm for multifractal analysis of complex networks. This algorithm is demonstrated in the computation of the generalised fractal dimensions of some theoretical networks, namely scale-free networks, small-world networks and random networks, and of a class of real networks, namely the PPI networks of different species.
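To make the procedure above concrete, here is a minimal sketch of the random sequential box-covering algorithm, a well-documented technique in the fractal-network literature rather than the thesis's own code; it assumes networkx, and all names are illustrative. Boxes are seeded at random uncovered nodes, each box absorbs the still-uncovered nodes within shortest-path distance less than l_B of its seed, and the fractal dimension d_B is read off the scaling N_B(l_B) ~ l_B^(-d_B).

import random
import networkx as nx
import numpy as np

def rs_box_covering(G, l_B, rng=random):
    """Return the boxes (node sets) from one random sequential pass."""
    uncovered = set(G.nodes())
    boxes = []
    while uncovered:
        seed = rng.choice(tuple(uncovered))
        # Still-uncovered nodes within shortest-path distance < l_B of the
        # seed form one box; covered nodes cannot seed or join later boxes.
        sphere = nx.single_source_shortest_path_length(G, seed, cutoff=l_B - 1)
        box = uncovered.intersection(sphere)
        uncovered -= box
        boxes.append(box)
    return boxes

def fractal_dimension(G, box_sizes, passes=20):
    """Estimate d_B as minus the slope of log N_B(l_B) against log l_B."""
    mean_counts = [np.mean([len(rs_box_covering(G, l)) for _ in range(passes)])
                   for l in box_sizes]
    slope, _ = np.polyfit(np.log(box_sizes), np.log(mean_counts), 1)
    return -slope

Because the seeds are random, box counts are averaged over several passes. For the multifractal analysis described above, the box masses n_i/N obtained from the same covering would feed a partition sum Z(q, l_B) = sum_i (n_i/N)^q, whose scaling exponents yield the generalised fractal dimensions D_q.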
Our main finding is the existence of multifractality in scale-free networks and PPI networks, while multifractal behaviour is not confirmed for small-world networks and random networks. As another application, we generate gene interaction networks for patients and healthy people using the correlation coefficients between the microarray profiles of different genes. Our results confirm the existence of multifractality in gene interaction networks. This multifractal analysis thus provides a potentially useful tool for gene clustering and identification.

The third part of the thesis aims to investigate the topological properties of networks constructed from time series. Characterising complicated dynamics from time series is a fundamental problem of continuing interest in a wide variety of fields. Recent work indicates that complex network theory can be a powerful tool for analysing time series. Many existing methods for transforming time series into complex networks share a common feature: they define the connectivity of the network by the mutual proximity of different parts (e.g., individual states, state vectors, or cycles) of a single trajectory. In this thesis, we propose a new method to construct networks from time series: we define nodes as vectors of a fixed length drawn from the time series, and weight the edge between any two nodes by the Euclidean distance between the corresponding vectors. We apply this method to build networks for fractional Brownian motions, whose long-range dependence is characterised by their Hurst exponent. We verify the validity of this method by showing that time series with stronger correlation, and hence larger Hurst exponent, tend to have smaller fractal dimension and hence smoother sample paths. We then construct networks via the technique of the horizontal visibility graph (HVG), which has been widely used in recent years. We confirm a known linear relationship between the Hurst exponent of fractional Brownian motion and the fractal dimension of the corresponding HVG network. In the first application, we apply our newly developed box-covering algorithm to calculate the generalised fractal dimensions of the HVG networks of fractional Brownian motions, as well as those of binomial cascades and five bacterial genomes. The results confirm the monoscaling of fractional Brownian motion and the multifractality of the rest. As an additional application, we discuss the resilience of networks constructed from time series via two different approaches: the visibility graph (VG) and the horizontal visibility graph. Our finding is that the degree distribution of VG networks of fractional Brownian motions is scale-free (i.e., follows a power law), meaning that a large percentage of nodes must be destroyed before the network collapses into isolated parts, while the degree distribution of HVG networks of fractional Brownian motions has exponential tails, implying that HVG networks would not survive the same kind of attack.
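As a concrete illustration of the horizontal visibility graph construction referenced above (the standard definition from the literature, not the thesis's own code; names are illustrative), the sketch below links nodes i and j exactly when every value strictly between them lies below min(x_i, x_j). A monotone stack keeps the pass linear in practice.

import networkx as nx

def horizontal_visibility_graph(series):
    """HVG: i ~ j iff series[k] < min(series[i], series[j]) for all i < k < j."""
    G = nx.Graph()
    G.add_nodes_from(range(len(series)))
    stack = []  # indices whose horizontal view to the right is not yet blocked
    for j, x in enumerate(series):
        # Every stacked index with a smaller value sees j, then is blocked by it.
        while stack and series[stack[-1]] < x:
            G.add_edge(stack.pop(), j)
        # The nearest remaining index (value >= x) also sees j.
        if stack:
            G.add_edge(stack[-1], j)
        stack.append(j)
    return G

For a fractional Brownian motion sample x, nx.degree_histogram(horizontal_visibility_graph(x)) exposes the degree distribution directly, so the exponential tail noted above can be checked empirically.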
Abstract:
Information Technology (IT) education is in crisis. Enrolments have dropped by as much as 70% at some universities (Markoff, 2009). This, coupled with traditionally high attrition and failure rates (Biggers et al., 2008), has resulted in the number of graduates nationwide being far lower than industry demand (Queensland Government SkillsInfo Report, 2009). This work reports on a radical redesign of the Bachelor of IT degree at QUT. The initial results are very promising, with first-year attrition dropping from one of the highest rates for an undergraduate degree at QUT to one of the lowest. The redesign followed an action research model: reflecting on issues and problems with the previous version of the degree, then introducing changes to attempt to rectify some of them. The resulting degree aims to produce "business savvy" graduates who are capable of using their IT knowledge and skills within cross-functional teams to solve complex problems.
Abstract:
Process modelling – the design and use of graphical documentation of an organisation's business processes – is a key method for documenting and using information about business processes in organisational projects. Yet despite current interest in process modelling, this area of study still faces essential challenges. One of the key unanswered questions concerns the impact of process modelling in organisational practice. Process modelling initiatives call for tangible results in the form of returns on the substantial investments that organisations make to achieve improved processes. This study explores the impact of process model use on end-users and its contribution to organisational success. We posit that the use of conceptual models creates impact in organisational process teams. We also report on a set of case studies in which we explore tentative evidence for how the impact of process model use develops. The results of this work provide a better understanding of process modelling impact from the perspective of information practices, and also yield insights into how organisations should conduct process modelling initiatives in order to achieve an optimum return on their investment.
Abstract:
Tacit knowledge sharing amongst physicians, such as the sharing of clinical experience, skills, know-how, or know-who, is known to have a significant impact on the quality of medical diagnoses and decisions. This paper posits that social media can provide new opportunities for tacit knowledge sharing amongst physicians, and demonstrates this by presenting findings from a review of the relevant literature and from a survey of physicians. Semi-structured interviews were conducted with ten physicians from around the world who were active users of social media. Initial thematic analysis revealed eight themes as potential contributions of social web tools to facilitating tacit knowledge flow amongst physicians. The emergent themes are defined, linked to the literature, and supported by excerpts from the interview transcripts. The findings presented here are preliminary; final results will be reported once all phases of data collection and analysis are complete.
Abstract:
The World Wide Web constitutes one of the most important inventions of the late 20th century: it has changed culture, society, business, communication, politics, and many other fields of human endeavour, not least by providing a more user-friendly pathway of access to its major underlying technology, the Internet itself. Key phases in its development can be charted, especially through how it has been used to present and share information – and here the homepage, whether personal or professional, private or official, stands as a useful representation of wider Web trends. From hand-coded beginnings through several successive stages of experimentation and standardisation, to the shifting balance between personal sites and social networks, the homepage demonstrates how the Web itself, and its place in our lives, have changed.
Abstract:
Technological growth in the 21st century is exponential. At the same time, understanding of the associated risk, uncertainty and user acceptance remains scattered, which calls for appropriate study of people accepting controversial technology (PACT). The Internet and the services around it, such as the World Wide Web, e-mail, instant messaging and social networking, are becoming increasingly important in many aspects of our lives. Sharing medical and personal health information over the Internet is controversial, and demands validity, usability and acceptance. While the literature suggests that the Internet enhances positive interactions between patients and physicians, some studies establish the opposite, pointing in particular to the associated risk. In recent years the Internet has attracted considerable attention as a means to improve health and health care delivery. However, it is not clear how widespread the use of the Internet for health care really is, or what impact it has on health care utilisation. Estimates of the impact of Internet usage vary widely between locations, both locally and globally. As a result, estimating (or predicting) Internet use and its effects on decision-making in Medical Informatics is impractical. This opens up research issues in validating and accepting Internet usage when designing and developing policy and process activities for Medical Informatics, Health Informatics and/or e-Health related protocols. Data on Internet usage for Medical Informatics related activities are difficult to access or simply unavailable. This paper presents a trend analysis of the growth of Internet usage in medical informatics related activities. To perform the analysis, data were extracted from publications in ERA (Excellence in Research for Australia) ranked "A" and "A*" journals and from reports in the authenticated public domain. The study is limited to analyses of Internet usage trends in the United States, Italy, France and Japan. Projected trends and their influence on the field of medical informatics are reviewed and discussed. The study clearly indicates a trend of patients becoming active consumers of health information rather than passive recipients.
Abstract:
In relation to enterprise technology governance (ETG), opinions range from there being no need for board-of-director involvement to there being an urgent need for it. This research highlights the need for boards to provide ETG oversight of technology-related strategy, investment and risk, and to be competent in doing so. We identify a large gap between boards' awareness of the importance of ETG, the action they take, and the competency requirements for effective ETG. Further, while there is considerable research and literature on operational IT governance frameworks and operational IT competencies, there is no known research into the specific competencies boards of directors need to effectively govern enterprise technology. This research develops a board-level ETG competency set using a mixed-methods approach within a recognised competency development framework. Its further development is tracked using a rigour scale, demonstrating a medium to high level of competency validity for the derived set. This research contributes to practice by providing the first known industry-validated ETG competency set situated within new and emerging technology. It contributes to the body of knowledge through the modification and application of competency development and competency validation frameworks not previously applied to the role of board director.
Abstract:
A fear of imminent information overload predates the World Wide Web by decades. Yet that fear has never abated. Worse, as the World Wide Web today takes the lion's share of the information we deal with, both in amount and in time spent gathering it, the situation has only become more precarious. This chapter analyses new issues in information overload that have emerged with the advent of the Web, which emphasizes written communication, defined in this context as the exchange of ideas expressed informally, often casually, as in verbal language. The chapter focuses on three ways to mitigate these issues. First, helping us, the users, be more specific in what we ask for. Second, helping us amend our request when we don't get what we think we asked for. And third, since only we, the human users, can judge whether the information received is what we want, making retrieval techniques more effective by basing them on how humans structure information. This chapter reports on extensive experiments we conducted in all three areas. First, to let users be more specific in describing an information need, they were allowed to express themselves in an unrestricted conversational style. This way, they could convey their information need as if they were talking to a fellow human instead of using the two or three words typically supplied to a search engine. Second, users were provided with effective ways to zoom in on the desired information once potentially relevant information became available. Third, a variety of experiments focused on the search engine itself as the mediator between the request and the delivery of information. All examples that are explained in detail have actually been implemented. The results of our experiments demonstrate how a human-centered approach can reduce information overload in an area that grows in importance with each passing day. Having actually built these applications, I present an operational, not merely aspirational, approach.
Abstract:
A people-to-people matching system (or match-making system) is one that users join with the objective of meeting other users with a common need. Real-world examples include employer-employee matching (in job search networks), mentor-student matching (in university social networks), consumer-to-consumer matching (in marketplaces) and male-female matching (in an online dating network). The network underlying these systems consists of two groups of users, and the relationships between users need to be captured to develop an effective match-making system. Most existing studies utilize information either about each user in isolation or about their interactions separately, and develop recommender systems using only one form of information. It is imperative to understand the linkages among the users in the network and to use them in developing a match-making system. This study utilizes several social network analysis methods, including graph theory, the small-world phenomenon, centrality analysis and density analysis, to gain insight into the entities and relationships present in this network. The paper also proposes a new type of graph called the "attributed bipartite graph". Using these analyses and the proposed type of graph, an efficient hybrid recommender system is developed that generates recommendations for new users and improves accuracy over the baseline methods.
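The abstract does not spell out the structure of the proposed "attributed bipartite graph", but a minimal reading — two disjoint user groups whose nodes carry attribute profiles and whose edges record cross-group interactions — can be sketched as follows. This is an illustrative assumption using networkx; the attribute names, weights and schema are invented, not the paper's specification.

import networkx as nx
from networkx.algorithms import bipartite

G = nx.Graph()
# Two disjoint user groups, each node carrying an attribute profile
# (hypothetical attributes for illustration only).
G.add_node("u1", bipartite=0, age=34, location="Brisbane", interests={"hiking"})
G.add_node("u2", bipartite=0, age=27, location="Sydney", interests={"film"})
G.add_node("v1", bipartite=1, age=31, location="Brisbane", interests={"hiking"})
G.add_node("v2", bipartite=1, age=29, location="Melbourne", interests={"film"})
# Edges record observed cross-group interactions, weighted by frequency.
G.add_edge("u1", "v1", contacts=3)
G.add_edge("u2", "v2", contacts=1)

group0 = {n for n, d in G.nodes(data=True) if d["bipartite"] == 0}
print(bipartite.density(G, group0))  # density analysis across the two groups
print(nx.degree_centrality(G))       # centrality analysis: most active users

Under this reading, node attributes support content-based similarity for brand-new users, while the edge structure carries the collaborative signal; combining the two is what would make the recommender hybrid.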
Abstract:
This research investigated the extent to which the World Wide Web and the participatory news media culture have contributed to the democratisation of journalism since 1997. It examined the different ways in which public service and commercial news media models use digital platforms to fulfil their obligations as members of the Fourth Estate. The research found that the digital environment provides news organisations with greater scope for transparency, interactivity, collaboration and social networking than the traditional print and broadcast platforms.
Abstract:
Business process modelling as a practice and research field has received great attention in recent years. Organizations invest significantly in process modelling in terms of training, tools, capabilities and resources. The return on this investment is a function of process model re-use, which we define as the recurring use of process models to support organizational work tasks. While prior research has examined re-use as a design principle, we explore re-use as a behaviour, because evidence suggests that analysts' re-use of process models is in fact limited. In this paper we develop a two-stage conceptualization of the key object-, behaviour- and socio-organization-centric factors explaining process model re-use behaviour. We propose a theoretical model and detail the implications for its operationalization and measurement. Our study can significantly benefit our understanding of process modelling and process model use as key practices in analysis and design.
Abstract:
Enterprise Social Networks continue to be adopted by organisations looking to increase collaboration between employees, customers and industry partners. While this technology offers a varied range of features and functionality, it can be distinguished by the underlying business models that its providers deploy. This study identifies and describes these different business models through an analysis of leading Enterprise Social Networks: Yammer, Chatter, SharePoint, Connections, Jive, Facebook and Twitter. A key contribution of this research is the identification of consumer and corporate models as the two extreme approaches. These findings align well with research on the adoption of Enterprise Social Networks that has discussed bottom-up and top-down approaches. Of specific interest are hybrid models that wrap a corporate model within a consumer model and may therefore provide the synergies of both. From a broader perspective, this can be seen as the merging of the corporate and consumer markets for IT products and services.
Abstract:
Software as a Service (SaaS) is anticipated to provide significant benefits to small and medium enterprises (SMEs) through ease of access to high-end applications, 24/7 availability, utility pricing, and so on. However, underlying SaaS is the assumption that SMEs will interact directly with the SaaS vendor and use a self-service model. In practice, we see the rise of SaaS intermediaries who support SMEs in using SaaS. This paper reports on an empirical study of how intermediaries support SMEs in sourcing and leveraging SaaS for their business. The knowledge contributions of this paper are: (1) the identification and description of the role of SaaS intermediaries; (2) the specification of a more basic intermediary role, with a technology orientation and an operational-alignment perspective; and (3) the specification of a more value-adding role, with a customer orientation and a strategic-alignment perspective.
Abstract:
This paper investigates how Enterprise Architecture (EA) evolves in response to emerging trends, specifically exploring how EA integrates the Service-oriented Architecture (SOA). Archer's Morphogenetic theory is used as an analytical approach to distinguish the architectural conditions under which SOA is introduced, to study the relationships between these conditions and SOA introduction, and to reflect on the EA evolution (elaboration) that then takes place. The paper focuses on the reasons why EA evolution does or does not take place, and on the architectural changes that can result from SOA integration. The research builds on sound theoretical foundations to discuss EA evolution in a field that often lacks a solid theoretical groundwork. Specifically, it proposes that critical realism, through the morphogenetic theory, can provide a useful theoretical foundation to study EA evolution. The initial results of a literature review (an a-priori model) were extended through exploratory interviews. The findings of this study are threefold. First, there are five different levels of EA-SOA integration outcomes. Second, a mature EA, a flexible and well-defined EA framework, and comprehensive EA objectives improve the integration outcomes. Third, the analytical separation afforded by Archer's theory is helpful for understanding how these different integration outcomes are generated.
Abstract:
This paper proposes that critical realism can provide a useful theoretical foundation to study enterprise architecture (EA) evolution. Specifically, it investigates the practically relevant and academically challenging question of how EAs integrate the Service-oriented Architecture (SOA). Archer's Morphogenetic theory is used as an analytical approach to distinguish the architectural conditions under which SOA is introduced, to study the relationships between these conditions and SOA introduction, and to reflect on the EA evolution (elaboration) that then takes place. The focus lies on the reasons why EA evolution takes place (or not) and on what architectural changes happen. The paper uses the findings of a literature review to build an a-priori model, informed by Archer's theory, to understand EA evolution in a field that often lacks a solid theoretical groundwork. The findings are threefold. First, EA can evolve on different levels (different integration outcomes). Second, the integration outcomes are classified into three levels: business architecture, information systems architecture and technology architecture. Third, the analytical separation afforded by Archer's theory is helpful for understanding how these different integration outcomes are generated.