910 results for Digital Manufacturing, Digital Mock Up, Simulation Intent
Abstract:
Road surface macro-texture is an indicator used to determine skid resistance levels in pavements. Existing methods of quantifying macro-texture include the sand patch test and the laser profilometer. These methods utilise the 3D information of the pavement surface to extract the average texture depth. Recently, interest has arisen in image processing techniques as a quantifier of macro-texture, mainly using the Fast Fourier Transform (FFT). This paper reviews the FFT method and then proposes two new methods, one using the autocorrelation function and the other using wavelets. The methods are tested on images obtained from a pavement surface extending more than 2 km. About 200 images were acquired from the surface at approximately 10 m intervals, from a height of 80 cm above the ground. The results obtained from the image analysis methods using the FFT, the autocorrelation function and wavelets are compared with sensor measured texture depth (SMTD) data obtained from the same paved surface. The results indicate that coefficients of determination (R²) exceeding 0.8 are obtained when up to 10% of outliers are removed.
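As a concrete illustration of the FFT approach reviewed here, the sketch below computes a simple spectral texture index from a grayscale pavement image. It is a minimal sketch only: the energy-ratio index, the cutoff radius and the function name are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def fft_texture_index(image: np.ndarray, low_cut: float = 4.0) -> float:
    """Illustrative FFT-based macro-texture index for a grayscale image.

    Returns the fraction of spectral energy above a low-frequency
    cutoff; rougher (coarser) textures put more energy into the
    higher spatial frequencies.
    """
    # Remove the mean so the DC component does not dominate.
    img = image.astype(float) - image.mean()
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2

    # Radial distance of each frequency bin from the spectrum centre.
    h, w = spectrum.shape
    y, x = np.ogrid[:h, :w]
    r = np.hypot(y - h / 2, x - w / 2)

    total = spectrum.sum()
    high = spectrum[r > low_cut].sum()
    return high / total  # in [0, 1]; higher = coarser texture
```

A per-image index like this could then be regressed against SMTD values along the chainage, which is the kind of comparison the paper reports.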
Abstract:
The Dark Ages are generally held to be a time of technological and intellectual stagnation in western development. But that is not necessarily the case. Indeed, from a certain perspective, nothing could be further from the truth. In this paper we draw historical comparisons, focusing especially on the thirteenth and fourteenth centuries, between the technological and intellectual ruptures in Europe during the Dark Ages and those of our current period. Our analysis is framed in part by Harold Innis's notion of "knowledge monopolies". We give an overview of how these were affected by the new media, new power struggles, and new intellectual debates that emerged in thirteenth- and fourteenth-century Europe. The historical salience of our focus may seem elusive. Our world has changed so much, and history seems to be an increasingly far-from-favoured method for understanding our own period and its future potentials. Yet our seemingly distant historical focus provides some surprising insights into the social dynamics at work today: the fracturing of established knowledge and power bases; the democratisation of certain "sacred" forms of communication and knowledge and, conversely, the "sacrosanct" appropriation of certain vernacular forms; challenges and innovations in social and scientific method and thought; the emergence of social world-shattering media practices; struggles over control of vast networks of media and knowledge monopolies; and the enclosure of public discursive and social spaces for singular, manipulative purposes. The period between the eleventh and fourteenth centuries in Europe prefigured what we now call the Enlightenment, perhaps more so than any other period before or after; it shaped what the Enlightenment was to become. We claim no knowledge of the future here. But in the "post-everything" society, where history is as much up for sale as it is for argument, we argue that our historical perspective provides a useful analogy for grasping the wider trends in the political economy of media, and for recognising clear and actual threats to the future of the public sphere in supposedly democratic societies.
Abstract:
The Queensland University of Technology (QUT) allows the presentation of a thesis for the Degree of Doctor of Philosophy in the format of published or submitted papers, where such papers have been published, accepted or submitted during the period of candidature. This thesis is composed of seven published/submitted papers, of which one has been published, three have been accepted for publication and the other three are under review. The project is financially supported by an Australian Research Council (ARC) Discovery Grant with the aim of proposing strategies for the performance control of Distributed Generation (DG) systems with digital estimation of power system signal parameters. Distributed Generation has recently been introduced as a new concept for the generation of power and the enhancement of conventionally produced electricity. The global warming issue calls for renewable energy resources in electricity production. Distributed generation based on solar energy (photovoltaic and solar thermal), wind, biomass and mini-hydro, along with the use of fuel cells and micro turbines, will gain substantial momentum in the near future. Technically, DG can be a viable solution to the issue of integrating renewable or non-conventional energy resources. Basically, DG sources can be connected to the local power system through power electronic devices, i.e. inverters or ac-ac converters. The interconnection of DG systems to the power system as a compensator or a power source with high-quality performance is the main aim of this study. Source and load unbalance, load non-linearity, interharmonic distortion, supply voltage distortion, distortion at the point of common coupling in weak source cases, source current power factor, and synchronism of generated currents or voltages are the issues of concern. The interconnection of DG sources is carried out using power electronic switching devices, which inject high-frequency components alongside the desired current. Also, noise and harmonic distortions can impact the performance of the control strategies. To mitigate the negative effects of high-frequency, harmonic and noise distortion, and to achieve satisfactory performance of DG systems, new methods of signal parameter estimation have been proposed in this thesis. These methods are based on processing the digital samples of power system signals. Thus, proposing advanced techniques for the digital estimation of signal parameters, and methods for the generation of DG reference currents using the estimates provided, is the targeted scope of this thesis. An introduction to this research – including a description of the research problem, the literature review and an account of the research progress linking the research papers – is presented in Chapter 1. One of the main parameters of a power system signal is its frequency. The Phasor Measurement (PM) technique is one of the renowned and advanced techniques used for the estimation of power system frequency. Chapter 2 presents an in-depth analysis of the PM technique to reveal its strengths and drawbacks. The analysis is followed by a new technique proposed to enhance the speed of the PM technique when the input signal is free of even-order harmonics. The other, novel techniques proposed in this thesis are compared with the PM technique comprehensively studied in Chapter 2. An algorithm based on the concept of Kalman filtering is proposed in Chapter 3.
The algorithm is intended to estimate signal parameters such as amplitude, frequency and phase angle in online mode. The Kalman filter is modified to operate on the output signal of a Finite Impulse Response (FIR) filter designed as a plain summation. The frequency estimation unit is independent of the Kalman filter and uses the samples refined by the FIR filter. The estimated frequency is given to the Kalman filter to be used in building the transition matrices. The initial settings for the modified Kalman filter are obtained through a trial-and-error exercise. Another algorithm, again based on the concept of Kalman filtering, is proposed in Chapter 4 for the estimation of signal parameters. The Kalman filter is likewise modified to operate on the output signal of the same FIR filter explained above. However, unlike the one proposed in Chapter 3, the frequency estimation unit is not segregated; it interacts with the Kalman filter. The estimated frequency is given to the Kalman filter, and other parameters, such as the amplitudes and phase angles estimated by the Kalman filter, are fed back to the frequency estimation unit. Chapter 5 proposes another algorithm based on the concept of Kalman filtering. This time, the state parameters are obtained through matrix arrangements in which the noise level is reduced on the sample vector. The purified state vector is used to obtain a new measurement vector for a basic Kalman filter. The Kalman filter used has a structure similar to a basic Kalman filter, except that the initial settings are computed through extensive mathematical analysis of the matrix arrangement utilised. Chapter 6 proposes another algorithm based on the concept of Kalman filtering, similar to that of Chapter 3. However, this time the initial settings required for the better performance of the modified Kalman filter are calculated instead of being guessed through trial-and-error exercises. The simulation results for the estimated signal parameters are enhanced due to the correct settings applied. Moreover, an enhanced Least Error Square (LES) technique is proposed to take over the estimation when a critical transient is detected in the input signal. In fact, some large, sudden changes in the parameters of the signal at these critical transients are not very well tracked by Kalman filtering. However, the proposed LES technique is found to be much faster in tracking these changes. Therefore, an appropriate combination of the LES technique and modified Kalman filtering is proposed in Chapter 6. Also, this time the ability of the proposed algorithm is verified on real data obtained from a prototype test object. Chapter 7 proposes a further algorithm based on the concept of Kalman filtering, similar to those of Chapters 3 and 6. However, this time an optimal digital filter is designed instead of the simple summation FIR filter. New initial settings for the modified Kalman filter are calculated based on the coefficients of the digital filter applied. Again, the ability of the proposed algorithm is verified on real data obtained from a prototype test object. Chapter 8 uses the estimation algorithm proposed in Chapter 7 in an interconnection scheme for a DG connected to the power network. Robust estimates of the signal amplitudes and phase angles obtained by the estimation approach are used in the reference generation of the compensation scheme.
Several simulation tests provided in this chapter show that the proposed scheme handles source and load unbalance, load non-linearity, interharmonic distortion, supply voltage distortion, and synchronism of generated currents or voltages very well. The proposed compensation scheme also prevents voltage distortion at the point of common coupling in weak source cases, balances the source currents, and brings the supply-side power factor to a desired value.
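To make the recurring Kalman-filter idea concrete, here is a minimal sketch of a linear Kalman filter tracking the in-phase/quadrature state of a nominally 50 Hz signal, pre-smoothed by a plain moving-sum FIR filter as described above. The state model, tuning values and window length are illustrative assumptions; this is a textbook phasor tracker, not the thesis's modified filter or its computed initial settings.

```python
import numpy as np

def track_phasor(samples, fs, f_nom=50.0, q=1e-5, r=1e-2):
    """Sketch: Kalman filter estimating amplitude and phase of a
    sinusoid at nominal frequency f_nom, from samples taken at rate fs.
    The process/measurement noise values q and r are guesses."""
    n_fir = int(fs / f_nom)                 # one-cycle moving average
    smoothed = np.convolve(samples, np.ones(n_fir) / n_fir, mode="valid")

    w = 2 * np.pi * f_nom / fs              # phase advance per sample
    F = np.array([[np.cos(w), -np.sin(w)],  # transition: rotate phasor
                  [np.sin(w),  np.cos(w)]])
    H = np.array([[1.0, 0.0]])              # we observe the in-phase part
    x = np.zeros(2)
    P = np.eye(2)
    Q = q * np.eye(2)
    R = np.array([[r]])

    amps, phases = [], []
    for z in smoothed:
        # Predict.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new filtered sample.
        S = H @ P @ H.T + R
        K = P @ H.T / S
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        amps.append(np.hypot(x[0], x[1]))
        phases.append(np.arctan2(x[1], x[0]))
    return np.array(amps), np.array(phases)
```

In the thesis's schemes the transition matrix would be rebuilt from the separately estimated frequency at each step; here f_nom is held fixed to keep the sketch short.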
Abstract:
Virtual prototyping is emerging as a new technology to replace existing physical prototypes for product evaluation, which are costly and time-consuming to manufacture. Virtualisation technology allows engineers and ergonomists to perform virtual builds and different ergonomic analyses on a product. Digital Human Modelling (DHM) software packages such as Siemens Jack often integrate with CAD systems to provide a virtual environment that allows investigation of operator and product compatibility. Although the integration between DHM and CAD systems allows for the ergonomic analysis of anthropometric design, human musculoskeletal multi-body modelling software packages such as the AnyBody Modelling System (AMS) are required to support physiological design. They provide muscular force analysis, estimate human musculoskeletal strain and help address human comfort assessment. However, the independent characteristics of the modelling systems Jack and AMS constrain engineers and ergonomists in conducting a complete ergonomic analysis. AMS is a stand-alone programming system without the capability to integrate into CAD environments. Jack provides CAD-integrated human-in-the-loop capability, but without considering musculoskeletal activity. Consequently, engineers and ergonomists need to perform many redundant tasks during product and process design. In addition, the existing biomechanical model in AMS uses a simplified estimation of body proportions, based on a scaling approach derived from segment mass ratios. This is insufficient to represent user populations anthropometrically correctly in AMS. Furthermore, sub-models are derived from different sources of morphologic data and are therefore anthropometrically inconsistent. Therefore, an interface between the biomechanical AMS and the virtual human model Jack was developed to integrate a musculoskeletal simulation with Jack posture modelling. This interface provides direct data exchange between the two human models, based on a consistent data structure and a common body model. The study assesses the kinematic and biomechanical model characteristics of Jack and AMS, and defines an appropriate biomechanical model. The information content for interfacing the two systems is defined and a protocol is identified. The interface program is developed and implemented through Tcl and Jack-script (Python), and interacts with the AMS console application to operate AMS procedures.
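The data-exchange idea can be sketched as follows: posture data from Jack (joint angles) is written in a neutral format that a driver script can hand to the AMS console application. Everything here is hypothetical — the joint names, the JSON format and the console invocation are placeholders standing in for the interface's actual common body model and protocol.

```python
import json

# Hypothetical joint-angle export from a Jack posture. Joint names and
# the file format are invented for illustration; the real interface
# uses a common body model shared by both systems.
posture = {
    "LumbarSpine":   {"flexion": 12.0, "lateral_bend": 0.0, "rotation": 3.5},
    "RightShoulder": {"flexion": 45.0, "abduction": 20.0, "rotation": 0.0},
}

def export_posture(posture: dict, path: str) -> None:
    """Write joint angles in a neutral format both systems can parse."""
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(posture, fh, indent=2)

export_posture(posture, "posture.json")

# A driver script would then invoke the AMS console application to run
# the musculoskeletal analysis on this posture; the command line for
# that step is deliberately omitted here, as it is interface-specific.
```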
Abstract:
Language-use has proven to be the most complex and complicating of all Internet features, yet people and institutions invest enormously in language and cross-language features because they are fundamental to the success of the Internet's past, present and future. The thesis focuses on the development of the latter – features that facilitate and signify linking between or across languages – in both their historical and current contexts. In the theoretical analysis, the conceptual platform of inter-language linking is developed both to accommodate efforts towards a new social complexity model for the co-evolution of languages and language content, and to create an open analytical space for language and cross-language related features of the Internet and beyond. The practised uses of inter-language linking have changed over the last decades. Before and during the first years of the WWW, mechanisms of inter-language linking were at best important elements used to create new institutional or content arrangements, but on a large scale they were insignificant. This changed with the emergence of the WWW and its development into a web in which content in different languages co-evolves. The thesis traces the inter-language linking mechanisms that facilitated these dynamic changes by analysing what these linking mechanisms are, how their historical as well as current contexts can be understood, and what kinds of cultural-economic innovation they enable and impede. The study discusses this alongside four empirical cases of bilingual or multilingual media use, ranging from television and web services for the languages of smaller populations to large-scale web ventures involving multiple languages by the British Broadcasting Corporation, the Special Broadcasting Service Australia, Wikipedia and Google. To sum up, the thesis introduces the concepts of 'inter-language linking' and the 'lateral web' to model the social complexity and co-evolution of languages online. The resulting model reconsiders existing social complexity models in that it is the first that can explain the emergence of large-scale, networked co-evolution of languages and language content facilitated by the Internet and the WWW. Finally, the thesis argues that the Internet enables an open space for language and cross-language related features, and investigates how far this process is facilitated by (1) amateurs and (2) human-algorithmic interaction cultures.
Abstract:
The public apology to the Forgotten Australians in late 2009 was, for many, the culmination of a long campaign for recognition and justice. The groundswell for this apology was built through a series of submissions which documented the systemic institutionalised abuse and neglect experienced by the Forgotten Australians, which has resulted, for some, in life-long disadvantage and marginalisation. Interestingly, it seems that rather than the official documents being the catalyst for change and prompting this public apology, it was more often the personal stories of the Forgotten Australians that resonated and, over time, drew out a torrent of support from the public leading up to, during and after the public apology, just as had been the case with the 'Stolen Generation'. Research suggests (cite) that the ethics of such national apologies only make sense if the personal stories are seen as a collective responsibility of society, and only carry weight if we understand and seek to nationally address the trauma experienced by such victims. In the case of the Forgotten Australians, the National Library of Australia's Forgotten Australians and Former Child Migrants Oral History Project and the National Museum's Inside project demonstrate commitment to the digitisation of the Forgotten Australians' stories in order to promote a better public understanding of their experiences, and to institutionally (and therefore formally) value them with renewed social importance. Our project builds on this work not by making or collecting more stories, but by examining the role of the internet and digital technologies used in the production and dissemination of individuals' stories that had already been created between the tabling of the senate inquiry, Children in Institutional Care (1999 or 2003?), and a formal national apology delivered in Federal Parliament by PM Kevin Rudd (9 Nov, 2009?). This timeframe also represents the emergent first decade of Internet use by Australians, including the rapid, easily accessible digital technologies and social media tools that were at our disposal, along with the promises the technology claimed to offer – that is, that more people would benefit from the social connections these technologies allegedly were giving us.
Abstract:
This essay considers a specific digital 'archive' of early Australian children's literature, known as the Children's Literature Digital Resources (CLDR), which is located in AustLit: The Australian Literature Resource (http://www.austlit.edu.au). We examine what the CLDR collection can tell us about Australia's history. To narrow the scope, we focus on how Australia was constructing itself as a nation with its own character, or national identity, in texts written for children from the latter part of the nineteenth century until the end of World War II. Our approach is to consider how the early Australian children's literature included in the CLDR collection rhetorically constructs nation and place, and in so doing constructs an Australian identity for its implied readers.
Abstract:
In 2011 Queensland suffered both floods and cyclones, leaving residents without homes and their communities in ruins. This paper presents how researchers from QUT, who are also members of the Oral History Association of Australia (OHAA) Queensland chapter, are using oral history, photographs, videography and digital storytelling to help heal and empower rural communities around the state, and how evaluation has become a key element of our research. QUT researchers ran storytelling workshops in the capital city of Brisbane in early 2011, after the city suffered severe flooding. Cyclone Yasi then struck the town of Cardwell (in February 2011), destroying its historical museum and recording equipment. We delivered an 'emergency workshop', offering participants hands-on use of the equipment along with ethical and interviewing theory, so that the community could start to build a new collection. We included oral history workshops as well as sessions on how best to use a video camera and a digital camera, plus creative writing sessions, so the community would also know how to make 'products' or exhibition pieces out of the interviews they were recording. We returned six months later to conduct follow-up workshops, and the material produced by and with the community was remarkable. More funding has now been secured to replicate the audio/visual/writing workshops in other remote and rural Queensland communities, including Townsville, Mackay, Cunnamulla and Toowoomba, in 2012, highlighting the need for a multimedia approach to leverage the most out of oral history interviews as a mechanism for restoring and promoting community resilience and pride.
Abstract:
Firms are moving away from decentralised regional offices. Last year the author spoke with a valuer working on the Sunshine Coast for a Brisbane firm. In years past this valuer would have left home in the morning to go to the office, as well as travelling during the day to client sites. Now they get up, have breakfast, change out of their pyjamas (if they have meetings!) and walk into their employer-provided home office to 'punch in'. Apart from travel for essential meetings at head office, or for the purpose of on-site inspections, they can attend work and engage with colleagues and clients without ever leaving home. While this practice may be a cost saving for the firm and a commuter-friendly way of working, it raises a range of issues to be managed.
Abstract:
This project researched the performance of emerging digital technology for high voltage electricity substations that significantly improves safety for staff and reduces the potential environmental impact of equipment failure. The experimental evaluation used a scale model of a substation control system that incorporated real substation control and networking equipment with real-time simulation of the power system. The outcomes confirm that it is possible to implement Ethernet networks in high voltage substations that meet the needs of utilities; however, component-level testing of devices is necessary to achieve this. The assessment results have been used to further develop international standards for substation communication and precision timing.
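For context on the "precision timing" side: substation time synchronisation is commonly built on IEEE 1588 (PTP), although the abstract does not name the standard, so treat that as an assumed context. The sketch below shows the standard delay request-response arithmetic by which a slave clock computes its offset from the master, assuming a symmetric network path.

```python
def ptp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """IEEE 1588 delay request-response computation.

    t1: master sends Sync        t2: slave receives Sync
    t3: slave sends Delay_Req    t4: master receives Delay_Req
    Assumes the forward and reverse path delays are equal (the
    usual PTP symmetry assumption).
    """
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    offset_from_master = ((t2 - t1) - (t4 - t3)) / 2
    return offset_from_master, mean_path_delay

# Example: slave clock running 50 us ahead, 100 us one-way delay:
# t2 - t1 = 150 us (delay + offset), t4 - t3 = 50 us (delay - offset).
off, dly = ptp_offset_and_delay(0.0, 150e-6, 300e-6, 350e-6)
assert abs(off - 50e-6) < 1e-12 and abs(dly - 100e-6) < 1e-12
```

Asymmetric network paths violate the symmetry assumption and translate directly into offset error, which is one reason component-level testing of networking devices matters for timing performance.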
Abstract:
Organizations increasingly make use of social media in order to compete for customer awareness and to improve the quality of their goods and services. Multiple techniques of social media analysis are already in use. Nevertheless, theoretical underpinnings and a sound research agenda are still unavailable in this field at the present time. In order to contribute to setting up such an agenda, we introduce digital social signal processing (DSSP) as a new research stream in IS that requires multi-faceted investigation. Our DSSP concept is founded upon a set of four sequential activities: sensing digital social signals that are emitted by individuals on social media; decoding online data of social media in order to reconstruct digital social signals; matching the signals with consumers' life events; and configuring individualized goods and service offerings tailored to the individual needs of customers. We further contribute by tying together loose ends of different research areas, in order to frame DSSP as a field for further investigation. We conclude by developing a research agenda.
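Read as a pipeline, the four sequential DSSP activities could be prototyped roughly as below. This is a minimal sketch under loose assumptions: the Signal structure, the keyword-based life-event matcher and the offer catalogue are all invented for illustration, not part of the authors' concept.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    """A reconstructed digital social signal (illustrative structure)."""
    user_id: str
    text: str
    timestamp: float

def sense(feed) -> list[Signal]:
    """Sensing: collect raw signals emitted by individuals on social media."""
    return [Signal(p["user"], p["text"], p["ts"]) for p in feed]

def decode(signals: list[Signal]) -> list[Signal]:
    """Decoding: reconstruct signals from noisy online data
    (here just a trivial normalisation placeholder)."""
    return [Signal(s.user_id, s.text.lower().strip(), s.timestamp)
            for s in signals]

# Hypothetical cue lists for two life events.
LIFE_EVENTS = {"moving": ["new house", "relocating"],
               "newborn": ["baby", "expecting"]}

def match(signals: list[Signal]) -> dict[str, list[str]]:
    """Matching: associate signals with consumers' life events."""
    hits: dict[str, list[str]] = {}
    for s in signals:
        for event, cues in LIFE_EVENTS.items():
            if any(c in s.text for c in cues):
                hits.setdefault(s.user_id, []).append(event)
    return hits

def configure(matches: dict[str, list[str]]) -> dict[str, str]:
    """Configuring: tailor an offering to each detected life event."""
    offers = {"moving": "home-insurance bundle", "newborn": "family plan"}
    return {u: offers[evts[0]] for u, evts in matches.items() if evts}

# End-to-end: feed -> signals -> life events -> individualized offers.
feed = [{"user": "u1", "text": "We are relocating next month!", "ts": 0.0}]
print(configure(match(decode(sense(feed)))))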
Abstract:
This PhD practice-led research inquiry sets out to examine and describe how the fluid interactions between memory and time can be rendered via the remediation of my painting and the construction of a digital image archive. My abstract digital art and handcrafted practice is informed by Deleuze and Guattari's rhizomics of becoming. I aim to show that the technological mobility of my creative strategies produces new conditions of artistic possibility through the mobile principles of rhizomic interconnection, multiplicity and diversity. Subsequently, through the ongoing modification of past paintings, I map how emergent forms and ideas open up new and incisive engagements with the experience of a 'continual present'. The deployment of new media and cross-media processes in my art also deterritorialises the modernist notion of painting as a static, two-dimensional spatial object. Instead, it shows painting in a postmodern field of dynamic and transformative intermediality, through digital formats of still and moving images that re-imagine the relationship between memory, time and creative practice.
Abstract:
The interaction of Au particles with few-layer graphene is of interest for the formation of the next generation of sensing devices [1]. In this paper we investigate the coupling of single gold nanoparticles to a graphene sheet, and of multiple gold nanoparticles with a graphene sheet, using COMSOL Multiphysics. These simulations allow us to determine the electric field strength and the associated hot-spots for various gold nanoparticle-graphene systems. The Au nanoparticles were modelled as 8 nm diameter spheres on 1.5 nm thick (5-layer) graphene, with the properties of graphene obtained from the refractive index data of Weber [2] and the Au refractive index data from Palik [3]. The field was incident along the plane of the sheet, with both s and p polarisations tested. The study showed strong localised interaction between the Au and graphene with limited spread; however, the double-particle case, in which the graphene sheet separated two Au nanoparticles, showed distinct interaction between the particles and the graphene. An offset of up to 4 nm was introduced, resulting in much reduced coupling between the opposed particles as the separation increased. The findings currently suggest that the graphene layer has limited interaction with incident fields when a single particle is present, whilst it confines the coupling region to a very fine area when opposing particles are involved. It is hoped that the results of this research will provide insight into graphene-plasmon interactions and spur the development of the next generation of sensing devices.
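For orientation on the single-particle numbers, a quasi-static (dipole) estimate of the surface field enhancement of a small Au sphere can be computed in a few lines. This is a back-of-envelope sketch, not the paper's COMSOL FEM model: the permittivity is a rough illustrative value for gold in the visible, not Palik's tabulated data, and the graphene layer is ignored entirely.

```python
import numpy as np

# Quasi-static dipole estimate of the field enhancement near a small
# Au sphere (valid for particles much smaller than the wavelength,
# which holds for an 8 nm diameter sphere in the visible).
eps_au = -5.0 + 2.0j   # illustrative relative permittivity of Au, not Palik data
eps_m = 1.0            # surrounding medium (vacuum/air)
a = 4e-9               # sphere radius for an 8 nm diameter particle

# Clausius-Mossotti polarizability (with eps0 factored out consistently).
alpha = 4 * np.pi * a**3 * (eps_au - eps_m) / (eps_au + 2 * eps_m)

# Field just outside the sphere surface, on the polarisation axis:
# E/E0 = 1 + 2*alpha / (4*pi*r^3), evaluated at r = a.
enhancement = abs(1 + 2 * alpha / (4 * np.pi * a**3))
print(f"|E|/|E0| at the surface ~ {enhancement:.1f}")
```

Capturing the particle-graphene-particle hot-spots reported in the paper requires the full electromagnetic solution (here, COMSOL), since the gap modes depend on retardation, the graphene response and the exact geometry.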
Abstract:
Beginning in the second half of the 20th century, ICTs transformed many societies from industrial societies, in which manufacturing was the central focus, into knowledge societies, in which dealing effectively with data and information has become a central element of work (Anderson, 2008). To meet the needs of the knowledge society, universities must reinvent their structures and processes, their curricula and pedagogic practices. In addition, higher education is of course itself subject to the sweeping influence of ICTs. But what might effective higher education look like in the 21st century? In designing higher education systems and learning experiences that are responsive to the learning needs of the future and exploit the possibilities offered by ICTs, we can learn much from the existing professional development strategies of people who are already successful in 21st century fields, such as digital media. In this study, I ask: (1) what are the learning challenges faced by digital media professionals in the 21st century? (2) what are the various roles of formal and informal education in their professional learning strategies at present? (3) how do they prefer to acquire needed capabilities? In-depth interviews were undertaken with successful Australian digital media professionals working in micro-businesses and SMEs to answer these questions. The strongest thematic grouping that emerged from the interviews related to the need for continual learning and relearning because of the sheer rate of change in the digital media industries. Four dialectical relationships became apparent from the interviewees' commentaries around the learning imperatives arising from the immense and continual changes occurring in the digital content industries: (1) currency vs best practice; (2) diversification vs specialisation of products and services; (3) creative outputs vs commercial outcomes; and (4) more learning opportunities vs less opportunity to learn. These findings point to the importance of 'learning how to learn' as a 21st century capability. The interviewees were ambivalent about university courses as preparation for professional life in their fields. Higher education was described by several interviewees as having relatively little value-add beyond what one described as "really expensive credentialling services." For all interviewees in this study, informal learning strategies were the preferred methods of acquiring the majority of knowledge and skills, both for initial and ongoing professional development. Informal learning has no 'curriculum' per se, and tends to be opportunistic, unstructured, pedagogically agile and far more self-directed than formal learning (Eraut, 2004). In an industry affected by constant change, informal learning is clearly both essential and ubiquitous. Inspired by the professional development strategies of the digital media professionals in this study, I propose a 21st century model of the university as a broad, open learning ecology, which also includes industry, professionals, users, and university researchers. If created and managed appropriately, the university learning network becomes the conduit and knowledge integrator for the latest research and industry trends, which students and professionals alike can access as needed.