500 results for Network representation
Abstract:
Through a focus on the remediation of personal photography in the Flickr photosharing website, in this essay I treat vernacular creativity as a field of cultural practice; one that does not operate inside the institutions or cultural value systems of high culture or the commercial popular media, and yet draws on and is periodically appropriated by these other systems in dynamic and productive ways. Because of its porosity to commercial culture and art practice, this conceptual model of ‘vernacular creativity’ implies a historicised account of ‘ordinary’ or everyday creative practice that accounts for both continuity and change and avoids creating a nostalgic desire for the recuperation of an authentic folk culture. Moving beyond individual creative practice, the essay concludes by considering the unintended consequences of vernacular creativity practised in online social networks, in particular the idea of cultural citizenship.
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems to support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future generation GNSS constellations, including modernised GPS, Galileo, GLONASS, and Compass, with multiple frequencies have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated NRTK and single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• Multiple GNSS constellations and multiple frequencies
• Large scale, wide area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous users’ requests (reverse RTK)
Based on these four challenges, future NRTK systems face two major data processing requirements: expandable computing power and scalable data sharing/transfer capability. This research explores new approaches to address these future NRTK challenges and requirements using the Grid Computing facility, in particular for large data processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed in this research, which is a layered framework consisting of: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. The user’s request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and also on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed and the results have preliminarily demonstrated the concepts and functionality of the new NRTK framework based on Grid Computing, although some aspects of the system's performance are yet to be improved in future work.
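The Ntrip software adopted in this study implements a thin, HTTP-like protocol for streaming RTCM corrections over the Internet. As an illustrative sketch (not the study's code), a minimal Ntrip v1 client request might look like the following; the caster host, port, mountpoint and credentials are hypothetical placeholders, and error handling is omitted:

```python
import base64
import socket

# Hypothetical caster details; a real service supplies its own host,
# port (2101 is the conventional Ntrip port), mountpoint and account.
CASTER_HOST = "ntrip.example.org"
CASTER_PORT = 2101
MOUNTPOINT = "MOUNT1"
CREDENTIALS = base64.b64encode(b"user:password").decode("ascii")

# Ntrip v1 piggybacks on an HTTP/1.0-style GET for the mountpoint.
request = (
    f"GET /{MOUNTPOINT} HTTP/1.0\r\n"
    f"User-Agent: NTRIP SimpleClient/0.1\r\n"
    f"Authorization: Basic {CREDENTIALS}\r\n"
    "\r\n"
)

with socket.create_connection((CASTER_HOST, CASTER_PORT), timeout=10) as sock:
    sock.sendall(request.encode("ascii"))
    header = sock.recv(1024)          # a v1 caster answers "ICY 200 OK"
    if header.startswith(b"ICY 200 OK"):
        rtcm_chunk = sock.recv(4096)  # raw RTCM bytes, ready for an RTK engine
```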
Abstract:
This paper presents a model to estimate travel time using cumulative plots. Three different cases are considered: i) case-Det, using detector data only; ii) case-DetSig, using detector data and signal controller data; and iii) case-DetSigSFR, using detector data, signal controller data and saturation flow rate. The performance of the model for different detection intervals is evaluated. It is observed that the detection interval is not critical if signal timings are available: comparable accuracy can be obtained from a larger detection interval with signal timings or from a shorter detection interval without signal timings. The performance for case-DetSig and case-DetSigSFR is consistent, with accuracy generally above 95%, whereas case-Det is highly sensitive to the signal phases in the detection interval and its performance is uncertain if the detection interval is an integral multiple of the signal cycle.
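The cumulative-plots idea behind the model can be illustrated with a minimal sketch: under a first-in-first-out assumption, the travel time of the n-th vehicle is the horizontal gap between the upstream and downstream cumulative count curves. The counts and detection interval below are hypothetical, and the end-point clamping of the interpolation is a simplification:

```python
import numpy as np

interval = 30.0                          # detection interval [s] (hypothetical)
u_counts = np.array([5, 8, 6, 7, 9, 4])  # upstream detector counts per interval
d_counts = np.array([0, 4, 7, 6, 8, 8])  # downstream detector counts per interval

U = np.cumsum(u_counts)                  # upstream cumulative plot
D = np.cumsum(d_counts)                  # downstream cumulative plot
t = np.arange(1, len(U) + 1) * interval  # interval end times

# For the n-th vehicle, travel time is the horizontal distance between
# the two cumulative curves (FIFO assumption; np.interp clamps at the
# curve ends, which is adequate for a sketch).
n = np.arange(1, min(U[-1], D[-1]) + 1)
t_up = np.interp(n, U, t)                # time n-th vehicle passes upstream
t_down = np.interp(n, D, t)              # time n-th vehicle passes downstream
travel_times = t_down - t_up
print(f"mean travel time ~ {travel_times.mean():.1f} s")
```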
Groundwater flow model of the Logan River alluvial aquifer system, Josephville, South East Queensland
Abstract:
The study focuses on an alluvial plain situated within a large meander of the Logan River at Josephville near Beaudesert which supports a factory that processes gelatine. The plant draws water from on-site bores, as well as the Logan River, for its production processes and produces approximately 1.5 ML per day (Douglas Partners, 2004) of waste water containing high levels of dissolved ions. At present a series of treatment ponds are used to aerate the waste water, reducing the level of organic matter; the water is then used to irrigate grazing land around the site. Within the study the hydrogeology is investigated, a conceptual groundwater model is produced and a numerical groundwater flow model is developed from this. On the site are several bores that access groundwater, plus a network of monitoring bores. Assessment of drilling logs shows the area is formed from a mixture of poorly sorted Quaternary alluvial sediments, with a laterally continuous aquifer comprising coarse sands and fine gravels that is in contact with the river. This aquifer occurs at a depth of between 11 and 15 metres and is overlain by a heterogeneous mixture of silts, sands and clays. The study investigates the degree of interaction between the river and the groundwater within the fluvially derived sediments for reasons of both environmental monitoring and sustainability of the potential local groundwater resource. A conceptual hydrogeological model of the site proposes two hydrostratigraphic units: a basal aquifer of coarse-grained materials overlain by a thick semi-confining unit of finer materials. From this, a two-layer groundwater flow model and hydraulic conductivity distribution were developed based on bore monitoring and rainfall data, using MODFLOW (McDonald and Harbaugh, 1988) and PEST (Doherty, 2004) within the GMS 6.5 software (EMSI, 2008). A second model was also considered, with the alluvium represented as a single hydrogeological unit. Both models were calibrated to steady-state conditions, and sensitivity analyses of the parameters have demonstrated that both models are very stable for changes in the range of ±10% for all parameters and still reasonably stable for changes of up to ±20%, with RMS errors in the model always less than 10%. The preferred two-layer model was found to give the more realistic representation of the site, where water level variations and the numerical modelling showed that the basal layer of coarse sands and fine gravels is hydraulically connected to the river, while the upper layer, comprising a poorly sorted mixture of silt-rich clays and sands of very low permeability, limits infiltration from the surface to the lower layer. The paucity of historical data has limited the numerical modelling to a steady-state model based on groundwater levels during a drought period, and forecasts for varying hydrological conditions (e.g. short term as well as prolonged dry and wet conditions) cannot reasonably be made from such a model. If future modelling is to be undertaken it will be necessary to establish a regular program of groundwater monitoring and maintain a long-term database of water levels to enable a transient model to be developed at a later stage. This will require a valid monitoring network to be designed, with additional bores required for adequate coverage of the hydrogeological conditions at the Josephville site. Further investigations would also be enhanced by undertaking pump testing to investigate hydrogeological properties in the aquifer.
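As an aside on the calibration figure quoted above, a normalised RMS error check between observed and simulated heads can be computed as in the following minimal sketch; the head values are hypothetical placeholders, not site data:

```python
import numpy as np

# Hypothetical observed heads and model-simulated heads at monitoring bores [m].
observed = np.array([12.4, 11.9, 13.1, 12.7, 11.5])
simulated = np.array([12.6, 11.7, 13.0, 12.9, 11.8])

# RMS error, then normalised by the range of observations to give a percentage,
# the usual way a "<10%" calibration criterion is expressed.
rms = np.sqrt(np.mean((observed - simulated) ** 2))
nrms = 100.0 * rms / (observed.max() - observed.min())
print(f"RMS = {rms:.3f} m, normalised RMS = {nrms:.1f}%")
```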
Abstract:
Nonlinearity, uncertainty and subjectivity are the three predominant characteristics of contractor prequalification, which make the process more of an art than a scientific evaluation. A fuzzy neural network (FNN) model, amalgamating both fuzzy set and neural network theories, has been developed with the aim of improving the objectivity of contractor prequalification. Through FNN theory, the fuzzy rules used by the prequalifiers can be identified and the corresponding membership functions can be transformed. Eighty-five cases with detailed decision criteria and rules for prequalifying Hong Kong civil engineering contractors were collected. These cases were used for training (calibrating) and testing the FNN model. The performance of the FNN model was compared with the original results produced by the prequalifiers and with those generated by the general feedforward neural network (GFNN, i.e. a crisp neural network) approach. Contractors’ ranking orders, the model efficiency (R²) and the mean absolute percentage error (MAPE) were examined during the testing phase. The results indicate the applicability of the neural network approach to contractor prequalification and the benefits of the FNN model over the GFNN model. The FNN is a practical approach for modelling contractor prequalification.
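The two test-phase metrics named above, model efficiency (R²) and MAPE, are standard and can be computed as in the following minimal sketch; the prequalification scores are hypothetical placeholders:

```python
import numpy as np

# Hypothetical prequalifiers' scores and corresponding model outputs.
actual = np.array([72.0, 65.0, 80.0, 58.0, 69.0])
predicted = np.array([70.5, 66.0, 78.0, 60.0, 68.0])

# Model efficiency: 1 minus residual sum of squares over total sum of squares.
ss_res = np.sum((actual - predicted) ** 2)
ss_tot = np.sum((actual - actual.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Mean absolute percentage error.
mape = 100.0 * np.mean(np.abs((actual - predicted) / actual))
print(f"R^2 = {r2:.3f}, MAPE = {mape:.2f}%")
```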
Abstract:
The selection criteria for contractor pre-qualification are characterized by the co-existence of both quantitative and qualitative data. The qualitative data are non-linear, uncertain and imprecise. An ideal decision support system for contractor pre-qualification should be able to handle both quantitative and qualitative data, and to map the complicated non-linear relationships among the selection criteria, such that rational and consistent decisions can be made. In this paper, an artificial neural network model was developed to assist public clients in identifying suitable contractors for tendering. The pre-qualification criteria (variables) were identified for the model. One hundred and twelve real pre-qualification cases were collected from civil engineering projects in Hong Kong, and eighty-eight hypothetical pre-qualification cases were also generated according to the “if-then” rules used by professionals in the pre-qualification process. Each pre-qualification case consisted of input ratings for candidate contractors’ attributes and their corresponding pre-qualification decisions. The training of the neural network model was accomplished using a purpose-developed program in which a conjugate gradient descent algorithm was incorporated to improve the learning performance of the network. Cross-validation was applied to estimate the generalization errors based on re-sampling of the training pairs. The results of the analysis fully comply with the current practice of public developers in Hong Kong. The case studies show that the artificial neural network model is suitable for mapping the complicated non-linear relationship between contractors’ attributes and the corresponding pre-qualification (or disqualification) decisions. The artificial neural network model can therefore be considered an ideal alternative for performing the contractor pre-qualification task.
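As an illustration of the training setup described, the following is a minimal sketch of a one-hidden-layer network fitted with a conjugate-gradient optimiser; it is not the authors' program, and the attribute ratings, labels and network size are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data: 88 cases, 5 attribute ratings in [0, 1] each;
# label 1 means "pre-qualify" (an invented decision rule for illustration).
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(88, 5))
y = (X.mean(axis=1) > 0.5).astype(float)

H = 4                     # hidden units (illustrative choice)
n_w = 5 * H + H           # input->hidden weights plus hidden->output weights

def forward(w, X):
    W1 = w[:5 * H].reshape(5, H)
    w2 = w[5 * H:]
    h = np.tanh(X @ W1)                      # hidden layer
    return 1.0 / (1.0 + np.exp(-h @ w2))     # sigmoid output

def loss(w):
    p = forward(w, X)
    return np.mean((p - y) ** 2)             # squared error objective

# Conjugate-gradient optimisation; scipy approximates the gradient
# numerically here, which is fine for this tiny illustrative network.
res = minimize(loss, rng.normal(scale=0.1, size=n_w), method="CG")
accuracy = np.mean((forward(res.x, X) > 0.5) == y)
print(f"training accuracy = {accuracy:.2f}")
```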
Abstract:
Reflective learning is vital for successful practice-led education in fields such as animation, multimedia design and graphic design, and social network sites can accommodate various learning styles for effective reflective learning. In this paper, the researcher studies reflective learning through social network sites in two animation units. These units aim to provide students with an understanding of the tasks and workflows involved in the production of style sheets, character sheets and motion graphics for use in 3D productions for film, television and game design. In particular, an assessment item in these units requires students to complete online reflective journals throughout the semester. Reflective learning has been integrated into the unit design, and students are encouraged to reflect weekly on their learning processes and outcomes. A survey evaluating students’ learning experience was conducted, and its outcomes indicate that reflective learning based on social network sites will not be effective unless students’ learning circumstances are considered and peer-to-peer interactions are deliberately designed.
Abstract:
The creative industries are important because they are clustered at the point of attraction for a billion or more young people around the world. They're the drivers of demographic, economic and political change. They start from the individual talent of the creative artist and the individual desire and aspiration of the audience. These are the raw materials for innovation, change and emergent culture, scaled up to form new industries and coordinated into global markets based on social networks.
Abstract:
The ability to forecast machinery failure is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models for forecasting machinery health based on condition data. Although these models have aided the advancement of the discipline, they have made only a limited contribution to developing an effective machinery health prognostic system. The literature review indicates that there is not yet a prognostic model that directly models and fully utilises suspended condition histories (which are very common in practice, since organisations rarely allow their assets to run to failure); that effectively integrates population characteristics into prognostics for longer-range prediction in a probabilistic sense; that deduces the non-linear relationship between measured condition data and actual asset health; and that involves minimal assumptions and requirements. This work presents a novel approach to addressing these challenges. The proposed model consists of a feed-forward neural network whose training targets are asset survival probabilities estimated using a variation of the Kaplan-Meier estimator and a degradation-based failure probability density estimator. The adapted Kaplan-Meier estimator is able to model the actual survival status of individual failed units and to estimate the survival probability of individual suspended units. The degradation-based failure probability density estimator, on the other hand, extracts population characteristics and computes conditional reliability from available condition histories rather than from reliability data. The estimated survival probability and the relevant condition histories are presented to the neural network as training target and training input respectively. The trained network is capable of estimating the future survival curve of a unit when a series of condition indices are inputted. Although the proposed concept may be applied to the prognosis of various machine components, rolling element bearings were chosen as the research object because rolling element bearing failure is one of the foremost causes of machinery breakdowns. Computer-simulated and industry case study data were used to compare the prognostic performance of the proposed model with four control models, namely: two feed-forward neural networks with the same training function and structure as the proposed model but neglecting suspended histories; a time series prediction recurrent neural network; and a traditional Weibull distribution model. The results support the assertion that the proposed model performs better than the other four models and that it produces adaptive prediction outputs with a useful representation of survival probabilities. This work presents a compelling concept for non-parametric data-driven prognosis, and for utilising available asset condition information more fully and accurately. It demonstrates that machinery health can indeed be forecasted. The proposed prognostic technique, together with ongoing advances in sensors and data-fusion techniques, and increasingly comprehensive databases of asset condition data, holds promise for increased asset availability, maintenance cost effectiveness, operational safety and, ultimately, organisational competitiveness.
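The proposed model adapts the Kaplan-Meier estimator; the standard product-limit form, in which suspended (censored) histories shrink the at-risk set without counting as failures, is sketched below with hypothetical bearing lifetimes:

```python
import numpy as np

# Hypothetical lifetimes [hours]; False marks a suspended (censored) unit.
times = np.array([150.0, 200.0, 200.0, 320.0, 410.0, 500.0])
failed = np.array([True, True, False, True, False, True])

order = np.argsort(times)
times, failed = times[order], failed[order]

survival = 1.0
curve = []
at_risk = len(times)
for t in np.unique(times):
    d = np.sum((times == t) & failed)        # failures at time t
    if d > 0:
        survival *= (at_risk - d) / at_risk  # Kaplan-Meier product-limit update
    curve.append((t, survival))
    at_risk -= np.sum(times == t)            # failed and suspended units both
                                             # leave the risk set after time t
for t, s in curve:
    print(f"S({t:.0f} h) = {s:.3f}")
```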
Abstract:
Modern enterprise knowledge management systems typically require distributed approaches and the integration of numerous heterogeneous sources of information. A powerful foundation for these tasks is Topic Maps, which not only provide a semantic net-like means of knowledge representation and the possibility of using ontologies to model knowledge structures, but also offer concepts to link these knowledge structures with unstructured data stored in files, external documents, etc. In this paper, we present the architecture and prototypical implementation of a Topic Map application infrastructure, the ‘Topic Grid’, which enables transparent, node-spanning access to different Topic Maps distributed in a network.
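The Topic Maps building blocks the paper relies on, namely topics, associations between topics, and occurrences linking topics to unstructured resources, can be caricatured in a few lines; the class names, topic names and URI below are illustrative only, not the paper's API:

```python
from dataclasses import dataclass, field

@dataclass
class Topic:
    name: str
    # Occurrences: URIs of files or external documents linked to this topic.
    occurrences: list = field(default_factory=list)

@dataclass
class Association:
    type: str
    members: tuple  # pair of associated topics; role handling omitted

# A toy two-topic map with one occurrence and one association.
knowledge = Topic("Knowledge Management")
tm = Topic("Topic Maps")
tm.occurrences.append("file:///docs/topic-grid-architecture.pdf")
relations = [Association("is-foundation-for", (tm, knowledge))]
```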
Abstract:
Networks form a key part of the infrastructure of contemporary governance arrangements and, as such, are likely to persist for some time. Networks can take many forms and be formed for many reasons. Some networks have been explicitly designed to generate a collective response to an issue; some arise from a top-down perspective through mandate or coercion; while others rely more heavily on interpersonal relations and doing the right thing. In this paper, these three different perspectives are referred to as the “3Is”: Instrumental, Institutional and Interpersonal. It is proposed that these underlying motivations affect the process dynamics within the different types of networks in different ways and therefore influence the type of outcomes achieved. This proposition is tested through a number of case studies. An understanding of these differences will lead to more effective design and management of networks, and to clearer expectations of what can be achieved through them.
Abstract:
This paper considers some of the implications of the rise of design as a master metaphor of the information age. It compares the terms 'interaction design' and 'mass communication', suggesting that both can be seen as contradictions in terms, inappropriately preserving an industrial-age division between producers and consumers. With the shift from mass media to interactive media, semiotic and political power seems to be shifting too, from media producers to designers. This paper argues that it is important for the new discipline of 'interaction design' not to fall into habits of thought inherited from the 'mass' industrial era. Instead it argues for the significance, for designers and producers alike, of what I call 'distributed expertise', including social network markets, DIY culture, user-led innovation, consumer co-created content, and the use of Web 2.0 affordances for social, scientific and creative purposes as well as for entertainment. It considers the importance of the growth of 'distributed expertise' as part of a new paradigm in the growth of knowledge, which has 'evolved' through a number of phases, from 'abstraction' to 'representation' to 'productivity'. In the context of technologically mediated popular participation in the growth of knowledge and social relationships, the paper argues that the design and media-production professions need to cross, rather than maintain, the gap between experts and everyone else, enabling all the agents in the system to navigate the shift into the paradigm of mass productivity.
Abstract:
Artificial neural networks (ANNs) have demonstrated good predictive performance in a wide range of applications. They are, however, not considered sufficient for knowledge representation because of their inability to represent the reasoning process succinctly. This paper proposes a novel methodology, Gyan, that represents the knowledge of a trained network in the form of restricted first-order predicate rules. The empirical results demonstrate that an equivalent symbolic interpretation in the form of rules with predicates, terms and variables can be derived, describing the overall behaviour of the trained ANN with improved comprehensibility while maintaining the accuracy and fidelity of the propositional rules.
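For readers unfamiliar with rule extraction, the following is a minimal sketch of the generic black-box (pedagogical) approach, not the Gyan methodology itself: enumerate a small binary input space, query the trained network, and read off one propositional rule per positively classified pattern. The weights and threshold stand in for a trained network:

```python
import itertools
import numpy as np

def trained_net(x):
    # Stand-in for a trained ANN's decision function (hypothetical weights).
    w = np.array([0.9, 0.8, -0.7])
    return float(np.dot(w, x) > 0.5)

rules = []
for bits in itertools.product([0, 1], repeat=3):
    if trained_net(np.array(bits)) == 1.0:
        # One conjunctive propositional rule per positive input pattern;
        # first-order methods like Gyan go further, generalising such
        # rules with predicates and variables.
        lits = [f"x{i}" if b else f"not x{i}" for i, b in enumerate(bits)]
        rules.append(" AND ".join(lits) + " -> class=1")

print("\n".join(rules))
```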
Abstract:
With the phenomenal growth of electronic data and information, there is much demand for the development of efficient and effective systems (tools) to perform data mining tasks on multidimensional databases. Association rules describe associations between items in the same transactions (intra) or in different transactions (inter). Association mining attempts to find interesting or useful association rules in databases: this is the crucial issue for the application of data mining in the real world. Association mining can be used in many application areas, such as the discovery of associations between customers’ locations and shopping behaviours in market basket analysis. Association mining includes two phases. The first phase, called pattern mining, is the discovery of frequent patterns. The second phase, called rule generation, is the discovery of interesting and useful association rules in the discovered patterns. The first phase, however, often takes a long time to find all frequent patterns, and these patterns also include much noise. The second phase is also a time-consuming activity that can generate many redundant rules. To improve the quality of association mining in databases, this thesis provides an alternative technique, granule-based association mining, for knowledge discovery in databases, where a granule refers to a predicate that describes common features of a group of transactions. The new technique first transforms transaction databases into basic decision tables, then uses multi-tier structures to integrate pattern mining and rule generation in one phase for both intra- and inter-transaction association rule mining. To evaluate the proposed technique, this research defines the concept of meaningless rules by considering the correlations between data dimensions for intra-transaction association rule mining. It also uses precision to evaluate the effectiveness of inter-transaction association rules. The experimental results show that the proposed technique is promising.
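The notion of a granule can be made concrete with a minimal sketch: all transactions satisfying the same predicate over the condition attributes collapse into one granule with a support count, yielding a basic decision table. The tiny transaction table below is hypothetical:

```python
from collections import Counter

# Hypothetical transactions pairing a customer location with a purchase.
transactions = [
    {"location": "city", "basket": "bread"},
    {"location": "city", "basket": "bread"},
    {"location": "suburb", "basket": "milk"},
    {"location": "city", "basket": "milk"},
    {"location": "suburb", "basket": "milk"},
]

# Each distinct (location, basket) predicate defines one granule; the
# counter records how many transactions share that granule (its support).
granules = Counter((t["location"], t["basket"]) for t in transactions)

n = len(transactions)
for (loc, item), support in granules.items():
    print(f"location={loc} -> basket={item}  (support {support}/{n})")
```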
Abstract:
Arabic satellite television has recently attracted tremendous attention in both the academic and professional worlds, with special interest in Aljazeera as a curious phenomenon in the Arab region. Having made a household name for itself worldwide with the airing of the Bin Laden tapes, Aljazeera has set out to deliberately change the culture of Arabic journalism, as its current General Manager, Waddah Khanfar, has repeatedly stated, and to shake up Arab society by raising awareness of issues never discussed on television before and challenging long-established social and cultural values and norms, while promoting, as it claims, Arab issues from a presumably Arab perspective. Working within the meta-frame of democracy, this Qatar-based network has been received with mixed reactions, ranging from complete support to utter rejection, in both the West and the Arab world. This research examines the social semiotics of Arabic television and the socio-cultural impact of translation-mediated news in Arabic satellite television, with the aim of carrying out a qualitative content analysis, informed by framing theory, critical linguistic analysis, social semiotics and translation theory, within a re-mediation framework which rests on the assumption that a medium “appropriates the techniques, forms and social significance of other media and attempts to rival or refashion them in the name of the real” (Bolter and Grusin, 2000: 66). This is a multilayered inquiry into how translation operates at two different yet interwoven levels: translation proper, that is, the rendition of discourse from one language into another at the text level; and translation as a broader process of interpretation of social behaviour that is driven by the linguistic and cultural forms of another medium, resulting in new social signs generated from source meaning reproduced as target meaning that is bound to be different in many respects. The research focuses primarily on the news media, news making and reporting at Arabic satellite television, and looks at translation as a reframing process of news stories in terms of content and cultural values. This notion is based on the premise that, by its very nature, news reporting is a framing process which involves a reconstruction of reality into actualities in presenting the news and providing the context for it. In other words, the mediation of perceived reality through a media form such as television actually modifies the mind’s ordering and internal representation of the reality that is presented. The research examines the process of reframing, through translation, of news already framed or actualized in another language, and argues that in submitting framed news reports to the translation process several alterations take place, driven by linguistic and cultural constraints and shaped by the context in which the content is presented. These alterations, which involve recontextualizations, may be intentional or unintentional, motivated or unmotivated. Generally, they are the product of a lack of awareness of the dynamics and intricacies of turning a message from one language form into another. More specifically, they are the result of a synthesis process that consciously or subconsciously conforms to editorial policy and cultural interpretive frameworks. In either case, the original message is reproduced and the news is reframed.
For the case study, this research examines news broadcasts by the now world-renowned Arabic satellite television station Aljazeera, and, to a lesser extent and where access is feasible, the Lebanese Broadcasting Corporation (LBC) and Al-Arabiya, for comparison and cross-checking purposes. As a new phenomenon in the Arab world, Arabic satellite television, especially 24-hour news and current affairs, provides an interesting area worthy of study, not only for its immediate socio-cultural, professional and ethical implications for the Arabic media in particular, but also for news and current affairs production in the western media that rely on foreign-language sources and translation mediation for international stories.