841 results for Computer aided network analysis
Abstract:
This paper outlines a method for studying online activity using both qualitative and quantitative methods: topical network analysis. A topical network refers to "the collection of sites commenting on a particular event or issue, and the links between them" (Highfield, Kirchhoff, & Nicolai, 2011, p. 341). The approach complements the analysis of large datasets, enabling the examination and comparison of different discussions as a means of improving our understanding of the uses of social media and other forms of online communication. Developed for an analysis of political blogging, the method also has wider applications for other social media websites such as Twitter.
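A minimal sketch of how a topical network as defined above might be represented and analysed; the site names and links are invented for illustration, and the networkx library is an assumed tooling choice, not part of the original study:

```python
# Toy topical network: sites commenting on an issue are nodes, hyperlinks
# between their posts are directed edges. Site names are hypothetical.
import networkx as nx

links = [
    ("blog-a.example", "news-site.example"),
    ("blog-b.example", "blog-a.example"),
    ("blog-b.example", "news-site.example"),
]

topical_network = nx.DiGraph()
topical_network.add_edges_from(links)

# In-degree highlights the sites most linked to within the discussion.
for site, in_deg in sorted(topical_network.in_degree(), key=lambda x: -x[1]):
    print(site, in_deg)
```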
Abstract:
What psychological function does brand loyalty serve? Drawing on Katz's (1960) Functional Theory of Attitudes, we propose that there are four functions (or motivational antecedents) of loyalty: utilitarian, knowledge, value-expressive and ego-defensive. We discuss how each function relates to the three dimensions of loyalty (i.e. emotional, cognitive, and behavioural loyalty). This conceptualisation of brand loyalty is then explored using four consumer focus groups. These exploratory results demonstrate that applying a functional approach to brand loyalty yields insights which have not been apparent in previous research. More specifically, this paper notes insights into brand loyalty from a consumer's perspective, including the notion that the ego-defensive function is an orientation around what others think and feel. This creates possibilities for future research into brand loyalty via social network analysis, in order to better understand how the thoughts of others affect consumers' loyalty attributes.
Abstract:
Ocean gliders constitute an important advance in the highly demanding ocean monitoring scenario. Their efficiency, endurance and increasing robustness make these vehicles an ideal observing platform for many long-term oceanographic applications. However, they have also proved useful in the opportunistic short-term characterization of dynamic structures. Among these, mesoscale eddies are of particular interest due to the relevance they have in many oceanographic processes.
Abstract:
BACKGROUND There is increasing enrolment of international students in the Engineering and Information Technology disciplines and anecdotal evidence of a need for additional understanding and support for these students and their supervisors due to differences in both academic and social cultures. While there is a growing literature on supervisory styles and guidelines on effective supervision, there is little on discipline-specific, cross-cultural supervision responding to the growing diversity. In this paper, we report findings from a study of Engineering and Information Technology Higher Degree Research (HDR) students and supervision in three Australian universities. PURPOSE The aim was to assess perceptions of students and supervisors of factors influencing success that are particular to international or culturally and linguistically diverse (CaLD) HDR students in Engineering and Information Technology. DESIGN/METHOD Online survey and qualitative data were collected from international and CaLD HDR students and supervisors at the three universities. Bayesian network analysis, inferential statistics, and qualitative analysis provided the main findings. RESULTS Survey results indicate that both students and supervisors are positive about their experiences, and do not see language or culture as particularly problematic. The survey results also reveal strong consistency between the perceptions of students and supervisors on most factors influencing success. Qualitative analysis of critical supervision incidents has provided rich data that could help improve support services. CONCLUSIONS In contrast with anecdotal evidence, HDR completion data from the three universities reveal that international students, on average, complete in shorter time periods than domestic students. The analysis suggests that success is linked to a complex set of factors involving the student, supervision, the institution and the broader community.
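As a toy illustration of the kind of dependency a Bayesian network analysis encodes (not the study's actual model), completion probability can be expressed as conditional on parent factors such as supervision and institutional support; all variable names and probabilities below are invented:

```python
# Hypothetical two-parent Bayesian-network fragment: P(timely completion)
# depends on supervision quality and institutional support. Values invented.
p_good_supervision = 0.7
p_support = 0.6

# P(timely completion | supervision, support) - hypothetical conditional table.
p_complete = {
    (True, True): 0.9,
    (True, False): 0.75,
    (False, True): 0.6,
    (False, False): 0.4,
}

# Marginal probability of timely completion, summing over parent states.
p_timely = sum(
    p_complete[(sup, supp)]
    * (p_good_supervision if sup else 1 - p_good_supervision)
    * (p_support if supp else 1 - p_support)
    for sup in (True, False)
    for supp in (True, False)
)
print(f"P(timely completion) = {p_timely:.2f}")
```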
Abstract:
An evolution in the use of digital modelling has occurred in the Queensland Department of Public Works Division of Project Services over the last 20 years: from the initial implementation of computer aided design and documentation (CADD); to experimentation with building information modelling (BIM); to embedding integrated practice (IP); to current steps towards integrated project delivery (IPD), including the active involvement of consultants and contractors in the design/delivery process. This case study is one of three undertaken through the Australian Sustainable Built Environment National Research Centre investigating past R&D investment. The intent of these cases is to inform the development of policy guidelines for future investment in the construction industry in Australia. This research is informing the activities of CIB Task Group 85 R&D Investment and Impact. The uptake of digital modelling by Project Services has followed an incremental learning approach, driven by a strong and clear vision with a focus on developing more efficient delivery mechanisms through the use of new technology coupled with process change. Findings reveal an organisational focus on several areas including: (i) strategic decision making, including the empowerment of innovation leaders and champions; (ii) the acquisition and exploitation of knowledge; (iii) product and process development (with a focus on efficiency and productivity); (iv) organisational learning; (v) maximising the use of technology; and (vi) supply chain integration. Key elements of this approach include pilot projects, researcher engagement, industry partnerships and leadership.
Abstract:
Using Monte Carlo simulation for radiotherapy dose calculation can provide more accurate results when compared to the analytical methods usually found in modern treatment planning systems, especially in regions with a high degree of inhomogeneity. These more accurate results acquired using Monte Carlo simulation, however, often require orders of magnitude more calculation time to attain high precision, thereby reducing its utility within the clinical environment. This work aims to improve the utility of Monte Carlo simulation within the clinical environment by developing techniques which enable faster Monte Carlo simulation of radiotherapy geometries. This is achieved principally through the use of new high performance computing environments and simpler, alternative yet equivalent representations of complex geometries. Firstly, the use of cloud computing technology and its application to radiotherapy dose calculation is demonstrated. As with other supercomputer-like environments, the time to complete a simulation decreases as 1/n with increasing numbers n of cloud-based computers performing the calculation in parallel. Unlike traditional supercomputer infrastructure, however, there is no initial outlay of cost, only modest ongoing usage fees; the simulations described in the following are performed using this cloud computing technology. The definition of geometry within the chosen Monte Carlo simulation environment - Geometry & Tracking 4 (GEANT4) in this case - is also addressed in this work. At the simulation implementation level, a new computer aided design interface is presented for use with GEANT4, enabling direct coupling between manufactured parts and their equivalent in the simulation environment, which is of particular importance when defining linear accelerator treatment head geometry. Further, a new technique for navigating tessellated or meshed geometries is described, allowing for up to 3 orders of magnitude performance improvement with the use of tetrahedral meshes in place of complex triangular surface meshes. The technique has application in the definition of both mechanical parts in a geometry as well as patient geometry. Static patient CT datasets like those found in typical radiotherapy treatment plans are often very large and present a significant performance penalty on a Monte Carlo simulation. By extracting the regions of interest in a radiotherapy treatment plan, and representing them in a mesh based form similar to those used in computer aided design, the above mentioned optimisation techniques can be used to reduce the time required to navigate the patient geometry in the simulation environment. Results presented in this work show that these equivalent yet much simplified patient geometry representations enable significant performance improvements over simulations that consider raw CT datasets alone. Furthermore, this mesh based representation allows for direct manipulation of the geometry, enabling motion augmentation for time dependent dose calculation, for example. Finally, an experimental dosimetry technique is described which allows the validation of time dependent Monte Carlo simulation, like the ones made possible by the aforementioned patient geometry definition. A bespoke organic plastic scintillator dose rate meter is embedded in a gel dosimeter, thereby enabling simultaneous 3D dose distribution and dose rate measurement.
This work demonstrates the effectiveness of applying alternative and equivalent geometry definitions to complex geometries for the purposes of Monte Carlo simulation performance improvement. Additionally, these alternative geometry definitions allow for manipulations to be performed on otherwise static and rigid geometry.
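The 1/n scaling described in this abstract follows from the embarrassingly parallel nature of Monte Carlo particle transport: independent histories can be split evenly across workers and the tallies combined afterwards. The sketch below illustrates only that idea with a stand-in "simulation" function; it is not the thesis's GEANT4-based cloud workflow:

```python
# Toy Monte Carlo split across n workers: each worker simulates an equal share
# of the particle histories independently, so ideal wall time scales as 1/n.
# Illustrative only; not the GEANT4/cloud implementation from the thesis.
import random
from multiprocessing import Pool

def simulate_histories(n_histories: int, seed: int) -> float:
    """Stand-in dose tally from n_histories toy particle histories."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n_histories))

def run_parallel(total_histories: int, n_workers: int) -> float:
    share = total_histories // n_workers
    with Pool(n_workers) as pool:
        partial_tallies = pool.starmap(
            simulate_histories, [(share, seed) for seed in range(n_workers)]
        )
    return sum(partial_tallies)  # combine the independent tallies

if __name__ == "__main__":
    print(run_parallel(total_histories=1_000_000, n_workers=4))
```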
Abstract:
This article reports on the design and implementation of a computer-aided sheet nesting system (CASNS) for the nesting of two-dimensional irregular-shaped sheet-metal blanks on a given sheet stock or coil stock. The system is designed by considering several constraints of sheet-metal stamping operations, such as bridge width and grain orientation, and design requirements such as maximizing the strength of the part when subsequent bending is involved, minimization of scrap, and economic justification for a single or multiple station operation. Through many practical case studies, the system proves its efficiency, effectiveness and usefulness.
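A toy sketch of the simplest form of the nesting problem described above: placing blanks (reduced to their bounding widths) along a strip while keeping a minimum bridge width between parts and from the strip ends. All dimensions are hypothetical, and this greedy placement is not the CASNS algorithm, which handles irregular shapes, grain orientation and multi-station justification:

```python
# Toy strip nesting: place blanks left to right along a strip, keeping a
# minimum bridge width between parts and at both ends of the strip.

def nest_on_strip(blank_widths, strip_length, bridge_width):
    """Return x-positions of blanks that fit, plus the scrap length left over."""
    positions, x = [], bridge_width
    for width in blank_widths:
        if x + width + bridge_width > strip_length:
            break  # no room for this blank plus the trailing bridge
        positions.append(x)
        x += width + bridge_width
    return positions, strip_length - x

positions, scrap = nest_on_strip([40, 40, 40, 40], strip_length=200, bridge_width=5)
print(positions, scrap)  # [5, 50, 95, 140] and 15 units of scrap
```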
Abstract:
This article reports on the design and implementation of a Computer-Aided Die Design System (CADDS) for sheet-metal blanks. The system is designed by considering several factors, such as the complexity of blank geometry, reduction in scrap material, production requirements, availability of press equipment and standard parts, punch profile complexity, and tool elements manufacturing method. The interaction among these parameters and how they affect designers' decision patterns is described. The system is implemented by interfacing AutoCAD with the higher level languages FORTRAN 77 and AutoLISP. A database of standard die elements is created by parametric programming, which is an enhanced feature of AutoCAD. The greatest advantage achieved by the system is the rapid generation of the most efficient strip and die layouts, including information about the tool configuration.
Abstract:
Parametric and generative modelling methods are ways of making computer models more flexible and of formalising domain-specific knowledge. At present, no open standard exists for the interchange of parametric and generative information. The Industry Foundation Classes (IFC), an open standard for interoperability in building information models, are presented as the basis for an open standard in parametric modelling. The advantage of allowing parametric and generative representations is that the early design process can accommodate more iteration, and changes can be implemented more quickly than with traditional models. This paper begins with a formal definition of what constitutes parametric and generative modelling methods and then proceeds to describe an open standard in which the interchange of components could be implemented. As an illustrative example of generative design, Frazer's 'Reptiles' project from 1968 is reinterpreted.
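A minimal sketch of the kind of dependency a parametric representation captures: derived quantities are recomputed from driving parameters, so a single change propagates through the model. The component, parameters and generative rule below are invented for illustration and do not reflect the IFC-based schema proposed in the paper:

```python
# Toy parametric component: derived values are recomputed from driving
# parameters, so changing one parameter propagates through the model.
from dataclasses import dataclass

@dataclass
class ParametricWall:
    length: float          # driving parameter (m)
    window_spacing: float  # driving parameter (m)

    @property
    def window_count(self) -> int:
        # Generative rule: number of windows derived from the driving parameters.
        return int(self.length // self.window_spacing)

wall = ParametricWall(length=12.0, window_spacing=3.0)
print(wall.window_count)  # 4
wall.length = 18.0        # change a driving parameter ...
print(wall.window_count)  # ... and the derived value updates: 6
```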
Abstract:
Skin is the largest, and arguably, the most important organ of the body. It is a complex and multi-dimensional tissue, thus making it essentially impossible to fully model in vitro in conventional 2-dimensional culture systems. In view of this, rodents or pigs are utilised to study wound healing therapeutics or to investigate the biological effects of treatments on skin. However, there are many differences between the wound healing processes in rodents compared to humans (contraction vs. re-epithelialisation) and there are also ethical issues associated with animal testing for scientific research. Therefore, the development of human skin equivalent (HSE) models from surgical discard human skin has become an important area of research. The studies in this thesis compare, for the first time, native human skin and the epidermogenesis process in a HSE model. The HSE was reported to be a comparable model for human skin in terms of expression and localisation of key epidermal cell markers. This validated HSE model was utilised to study the potential wound healing therapeutic, hyperbaric oxygen (HBO) therapy. There is a significant body of evidence suggesting that lack of cutaneous oxygen results in and potentiates the chronic, non-healing wound environment. Although the evidence is anecdotal, HBO therapy has displayed positive effects on re-oxygenation of chronic wounds and the clinical outcomes suggest that HBO treatment may be beneficial. Therefore, the HSE was subjected to a daily clinical HBO regime and assessed in terms of keratinocyte migration, proliferation, differentiation and epidermal thickening. HBO treatment was observed to increase epidermal thickness, in particular stratum corneum thickening, but it did not alter the expression or localisation of standard epidermal cell markers. In order to elucidate the mechanistic changes occurring in response to HBO treatment in the HSE model, gene microarrays were performed, followed by qRT-PCR of select genes which were differentially regulated in response to HBO treatment. The biological diversity of the HSEs created from individual skin donors, however, overrode the differences in gene expression between treatment groups. Network analysis of functional changes in the HSE model revealed general trends consistent with normal skin growth and maturation. As a more robust and longer term study of these molecular changes, protein localisation and expression was investigated in sections from the HSEs undergoing epidermogenesis in response to HBO treatment. These proteins were CDCP1, metallothionein, kallikrein (KLK) 1, KLK7 and early growth response 1. While protein expression within the HSE models exposed to HBO treatment was not consistent across all HSEs derived from all skin donors, this is the first study to detect and compare both KLK1 and CDCP1 protein expression in both a HSE model and native human skin. Furthermore, this is the first study to provide such an in-depth analysis of the effect of HBO treatment on a HSE model. The data presented in this thesis demonstrate high levels of variation between individuals and their response to HBO treatment, consistent with the clinical variation that is currently observed.
Abstract:
As buildings have become more advanced and complex, our ability to understand how they are operated and managed has diminished. Modern technologies have given us systems to look after us, but they appear to have taken away our say in how we would like our environment to be managed. The aim of this paper is to discuss our research concerning spaces that are sensitive to changing needs and allow building users to have a certain level of freedom to understand and control their environment. We discuss why what we call the Active Layer is needed in modern buildings; how building inhabitants are to interact with it; and the development of interface prototypes to test the consequences of having the Active Layer in our environment.
Abstract:
Collaboration between faculty and librarians is an important topic of discussion and research among academic librarians. These partnerships between faculty and librarians are vital for enabling students to become lifelong learners through their information literacy education. This research developed an understanding of academic collaborators by analyzing a community college faculty's teaching social networks. A teaching social network, an original term generated in this study, is comprised of the communications that influence faculty when they design and deliver their courses. The communication may be formal (e.g., through scholarly journals and professional development activities) or informal (e.g., through personal communication) across the elements of their network. Examples of the elements of a teaching social network may be department faculty, administration, librarians, professional development, and students. This research asked: 'What is the nature of faculty's teaching social networks and what are the implications for librarians?' This study moves forward the existing research on collaboration, information literacy, and social network analysis. It provides both faculty and librarians with added insight into their existing and potential relationships. This research was undertaken using mixed methods. Social network analysis was the quantitative data collection methodology and the interview method was the qualitative technique. For the social network analysis data, a survey was sent to full-time faculty at Las Positas College, a community college in California. The survey gathered the data and described the teaching social networks for faculty with respect to their teaching methods and the content taught. Semi-structured interviews were conducted following the survey with a sub-set of survey respondents to understand why specific elements were included in their teaching social networks and to learn of ways for librarians to become an integral part of the teaching social networks. The majority of the faculty respondents were moderately influenced by the elements of their network, except that most were only weakly influenced by those elements with respect to the content taught. The elements with the most influence on both teaching methods and content taught were students, department faculty, professional development, and former graduate professors and coursework. The elements with the least influence on both aspects were public or academic librarians, and social media. The most popular roles for the elements were conversations about teaching, sharing ideas, tips for teaching, insights into teaching, suggestions for ways of teaching, and how to engage students. Librarians weakly influenced faculty in their teaching methods and the content taught. The motivating factors for collaboration with librarians were that students learned how to research, students' research projects improved, faculty saved time by having librarians provide the instruction to students, and faculty built strong working relationships with librarians. The challenges of collaborating with librarians were inadequate teaching techniques used when librarians taught research orientations and a lack of time. Ways librarians could become more integral in faculty's teaching social networks included more workshops for faculty, more proactive interaction with faculty, and more one-on-one training sessions for faculty.
Some of the recommendations for librarians from this study were to: develop a strong rapport with faculty; build information literacy services from the point of view of the faculty rather than from the librarian's perspective; use staff development funding to attend conferences and workshops to improve their teaching; develop more training sessions for faculty; increase marketing of the librarians' instructional services; and seek grant opportunities to increase funding for the library. In addition, librarians and faculty should review the definitions of information literacy and move from a skills-based interpretation to a learning process.
Abstract:
Situation awareness, one's understanding of 'what is going on', is a critical commodity for road users. Although the concept has received much attention in the driving context, situation awareness in vulnerable road users, such as cyclists, remains unexplored. This paper presents the findings from an exploratory on-road study of cyclist situation awareness, the aim of which was to explore how cyclists develop situation awareness, what their situation awareness comprises, and what the causes of degraded cyclist situation awareness may be. Twenty participants cycled a pre-defined urban on-road study route. A range of data were collected, including verbal protocols, forward scene video and rear video, and a network analysis procedure was used to describe and assess cyclist situation awareness. The analysis produced a number of key findings regarding cyclist situation awareness, including the potential for cyclists' awareness of other road users to be degraded due to additional situation awareness and decision making requirements that are placed on them in certain road situations. Strategies for improving cyclists' situation awareness are discussed.
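One common way to operationalise a network analysis of verbal protocols is to treat concepts mentioned in the same utterance as linked and examine which concepts are most connected. The sketch below is only an illustration under that assumption, with invented utterances; it is not the specific procedure used in the study:

```python
# Toy concept network from verbal protocol data: concepts co-occurring in the
# same utterance are linked; highly connected concepts dominate the reported
# awareness. Utterances are invented examples.
from itertools import combinations
from collections import Counter

utterances = [
    {"car", "intersection", "traffic light"},
    {"car", "door", "parked car"},
    {"traffic light", "pedestrian", "intersection"},
]

edge_counts = Counter()
for concepts in utterances:
    for a, b in combinations(sorted(concepts), 2):
        edge_counts[(a, b)] += 1

# Degree of each concept = number of distinct concepts it co-occurs with.
degree = Counter()
for a, b in edge_counts:
    degree[a] += 1
    degree[b] += 1

print(degree.most_common(3))
```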
Abstract:
In Victoria, as in other jurisdictions, there is very little research on the potential risks and benefits of lane filtering by motorcyclists, particularly from a road safety perspective. This on-road proof of concept study aimed to investigate whether and how lane filtering influences motorcycle rider situation awareness at intersections and to address factors that need to be considered for the design of a larger study in this area. Situation awareness refers to road users' understanding of 'what is going on' around them and is a critical commodity for safe performance. Twenty-five experienced motorcyclists rode their own instrumented motorcycle around an urban test route in Melbourne whilst providing verbal protocols. Lane filtering occurred in 27% of 43 possible instances in which there were one or more vehicles in the traffic queue and the traffic lights were red on approach to the intersection. A network analysis procedure, based on the verbal protocols provided by motorcyclists, was used to identify differences in motorcyclist situation awareness between filtering and non-filtering events. Although similarities in situation awareness across filtering and non-filtering motorcyclists were found, the analysis revealed some differences. For example, filtering motorcyclists placed more emphasis on the timing of the traffic light sequence and on their own actions when moving to the front of the traffic queue, whilst non-filtering motorcyclists paid greater attention to traffic moving through the intersection and approaching from behind. Based on the results of this study, the paper discusses some methodological and theoretical issues to be addressed in a larger study comparing situation awareness between filtering and non-filtering motorcyclists.
Abstract:
Whether by using electronic banking, by using credit cards, or by synchronising a mobile telephone via Bluetooth to an in-car system, humans are a critical part of many cryptographic protocols daily. We reduced the gap that exists between the theory and the reality of the security of these cryptographic protocols involving humans by creating tools and techniques for proofs and implementations of human-followable security. After three human research studies, we present a model for capturing human recognition; we provide a tool for generating values called Computer-HUman Recognisable Nonces (CHURNs); and we provide a model for capturing human perceptible freshness.
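As a purely illustrative sketch of the general idea behind a human-recognisable nonce, cryptographic randomness can be encoded as words a person can read and compare rather than as raw hex. The wordlist, function name and encoding below are assumptions for illustration; this is not the CHURN construction from the thesis:

```python
# Illustration only: encode cryptographic randomness as readable words so a
# human can recognise and compare nonce values. NOT the paper's CHURN scheme.
import secrets

# A real scheme would use a much larger wordlist; 8 words gives only 3 bits/word.
WORDLIST = ["apple", "river", "stone", "cloud", "tiger", "lemon", "piano", "maple"]

def human_recognisable_nonce(n_words: int = 4) -> str:
    # secrets.choice draws each word with cryptographic randomness.
    return "-".join(secrets.choice(WORDLIST) for _ in range(n_words))

print(human_recognisable_nonce())  # e.g. "stone-lemon-apple-cloud"
```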