232 results for name


Relevance: 10.00%

Abstract:

A small-scale sculpture that contributes to my ongoing explorations into how our collective ability to sustain (the future) is as much a cultural problematic as it is an economic or technological one. The curatorial brief of the project was a technical one: each curated artist was to design a piece in CAD suitable for 3D resin printing. The object was to be generated entirely through 3D visualisation and modelling tools, and machined and shipped within the dimensions of 6cm x 6cm x 6cm. My design for this brief was influenced by recent research I had conducted in Mildura, in the Sunraysia irrigated region of north-west Victoria. Each name set within the work is an Australian soldier-settler who, on returning from the ‘Great War’, was duly awarded a ‘block’ in Australia’s new inland irrigated settlements, with the explicit task of clearing it to plant and reap. Through their concerted and well-intentioned efforts, these workers began to profoundly re-shape Australia’s marginal country, inadvertently presaging the bleak future faced today by many of Australia’s inland lands and river systems. Furthermore, through that time’s predominant colonial conception of ‘terra nullius’ (the land is unoccupied and therefore free to be claimed), they each played a small but formative part in building the profound cultural divide between land and peoples that still haunts Australia today. THE EXHIBITION: Inside Out is a compelling international touring exhibition featuring forty-six miniature sculptures produced in resin using 3D printing technologies. Developments in virtual computer visualisation and integrated digital technologies are giving contemporary makers new insights and opportunities to create objects and forms that were previously impossible to produce or difficult to envisage. The exhibition is the result of collaboration between the Art Technology Coalition, the University of Technology Sydney and RMIT University in Australia, along with De Montfort University, Manchester Metropolitan University and Dartington College of Arts at University College Falmouth in the United Kingdom.

Relevance: 10.00%

Abstract:

When the colonisers first came to Australia there was an urgent desire to map, name and settle. This desire, in part, stemmed from a fear of the unknown. Once these tasks were completed, it was thought, a sense of identity and belonging would automatically follow. In Anglo-Australian geography the map of Australia was always perceived in relation to the larger map of Europe and Britain. The quicker Australia could be mapped, the quicker its connection with the ‘civilised’ world could be established. Official maps could be taken up in official history books and a detailed monumental history could begin. Australians would feel secure in their place in the world. However, this was not the case, and anxieties about identity and belonging remained. One of the biggest hurdles was the fear of open spaces and of not knowing how to move across the land. Attempts to transpose the colonisers’ use of space onto the Australian landscape did not work and led to confusion. Using authors who are often perceived as writers of national fictions (Henry Lawson, Barbara Baynton, Patrick White, David Malouf and Peter Carey), I will reveal how writing about space becomes a way to create a sense of belonging. It is through spatial knowledge and its application that we begin to gain a sense of closeness and identity. I will also look at how one of the greatest fears for the colonisers was the Aboriginal spatial command of the country. Aborigines already had a strongly developed awareness of spatial belonging, and their stories reveal this authority (seen in the work of Lorna Little and Mick McLean). Colonisers attempted to discredit this knowledge, but the stories and the land continue to recognise its legitimacy. From the beginning, Australian spaces have been spaces of hybridity, and the more the colonisers attempted to force predetermined structures onto these spaces, the more hybrid they became.

Relevance: 10.00%

Abstract:

Product placement is a fast-growing multi-billion-dollar industry, yet measures of its effectiveness, which influence the critical area of pricing, have been problematic. Past attempts to measure the effect of a placement, and therefore provide a basis for pricing of placements, have been confounded by the effect on consumers of multiple prior exposures to a brand name across all marketing communications. Virtual product placement offers certain advantages: it provides a tool for measuring the effectiveness of product placements; it addresses the lack of audience selectivity in traditional product placement; it allows different audiences to be tested for brands; and it fills a gap in the existing academic literature by focusing on the impact of product placement on recall and recognition of new brands.

Relevance: 10.00%

Abstract:

In an environment where it has become increasingly difficult to attract consumer attention, marketers have begun to explore alternative forms of marketing communication. One such form that has emerged is product placement, which has more recently appeared in electronic games. Given changes in media consumption and the growth of the games industry, it is not surprising that games are being exploited as a medium for promotional content. Other market developments are also facilitating and encouraging their use, in terms of both the insertion of brand messages into video games and the creation of brand-centred environments, labelled ‘advergames’. However, while there is much speculation concerning the beneficial outcomes for marketers, there remains a lack of academic work in this area and little empirical evidence of the actual effects of this form of promotion on game players. Only a handful of studies exploring the influence of game placements on consumers are evident in the literature. The majority have studied their effect on brand awareness, largely demonstrating that players can recall placed brands. Further, most research conducted to date has focused on computer and online games, but consoles represent the dominant platform for play (Taub, 2004). Finally, advergames have largely been neglected, particularly those in a console format. Widening the gap in the literature is the fact that insufficient academic attention has been given to product placement as a marketing communication strategy overall, and to games in general. The unique nature of the strategy also makes it difficult to apply existing literature to this context. To address a significant need for information in both the academic and business domains, the current research investigates the effects of brand and product placements in video games and advergames on consumer attitude to the brand and corporate image. It was conducted in two stages. Stage one was a pilot study. It explored the effects of simulated and peripheral placements in video games on players’ and observers’ attitudinal responses, and whether these are influenced by involvement with a product category or skill level in the game. The ability of gamers to recall placed brands was also examined. A laboratory experiment was employed with a small sample of sixty adult subjects drawn from an Australian east-coast university, some of whom were exposed to a console video game on a television set. The major finding of study one is that placements in a video game have no effect on gamers’ attitudes, but they are recalled. For stage two of the research, a field experiment was conducted with a large, random sample of 350 student respondents to investigate the effects on players of brand and product placements in handheld video games and advergames. The constructs of brand attitude and corporate image were again tested, along with several potential confounds. Consistent with the pilot, the results demonstrate that product placement in electronic games has no effect on players’ brand attitudes or corporate image, even when allowing for their involvement with the product category, skill level in the game, or skill level in relation to the medium. Age and gender also have no impact. However, the more interactive a player perceives the game to be, the higher their attitude to the placed brand and corporate image of the brand manufacturer.
In other words, when controlling for perceived interactivity, players experienced more favourable attitudes, but the effect was so weak it probably lacks practical significance. It is suggested that this result can be explained by the existence of excitation transfer, rather than any processing of placed brands. The current research provides strong, empirical evidence that brand and product placements in games do not produce strong attitudinal responses. It appears that the nature of the game medium, the game playing experience and product placement impose constraints on gamer motivation, opportunity and ability to process these messages, thereby precluding their impact on attitude to the brand and corporate image. Since this is the first study to investigate the ability of video game and advergame placements to facilitate these deeper consumer responses, further research across different contexts is warranted. Nevertheless, the findings have important theoretical and managerial implications. This investigation makes a number of valuable contributions. First, it is relevant to current marketing practice and presents findings that can help guide promotional strategy decisions. It also presents a comprehensive review of the games industry and associated activities in the marketplace, relevant for marketing practitioners. Theoretically, it contributes new knowledge concerning product placement, including how it should be defined, its classification within the existing communications framework, its dimensions and effects. This is extended to include brand-centred entertainment. The thesis also presents the most comprehensive analysis available in the literature of how placements appear in games. In the consumer behaviour discipline, the research builds on theory concerning attitude formation, through application of MacInnis and Jaworski’s (1989) Integrative Attitude Formation Model. With regard to the games literature, the thesis provides a structured framework for the comparison of games with different media types; it advances understanding of the game medium, its characteristics and the game playing experience; and it provides insight into console and handheld games specifically, as well as interactive environments generally. This study is the first to test the effects of interactivity in a game environment, and presents a modified scale that can be used as part of future research. Methodologically, it addresses the limitations of prior research through execution of a field experiment and observation with a large sample, making this the largest study of product placement in games available in the literature. Finally, the current thesis offers comprehensive recommendations that will provide structure and direction for future study in this important field.

Relevance: 10.00%

Abstract:

Since the 1960s, the value relevance of accounting information has been an important topic in accounting research. The value relevance research provides evidence as to whether accounting numbers relate to corporate value in a predicted manner (Beaver, 2002). Such research is not only important for investors but also provides useful insights into accounting reporting effectiveness for standard setters and other users. Both the quality of accounting standards used and the effectiveness associated with implementing these standards are fundamental prerequisites for high value relevance (Hellstrom, 2006). However, while the literature comprehensively documents the value relevance of accounting information in developed markets, little attention has been given to emerging markets where the quality of accounting standards and their enforcement are questionable. Moreover, there is currently no known research that explores the association between level of compliance with International Financial Reporting Standards (IFRS) and the value relevance of accounting information. Motivated by the lack of research on the value relevance of accounting information in emerging markets and the unique institutional setting in Kuwait, this study has three objectives. First, it investigates the extent of compliance with IFRS with respect to firms listed on the Kuwait Stock Exchange (KSE). Second, it examines the value relevance of accounting information produced by KSE-listed firms over the 1995 to 2006 period. The third objective links the first two and explores the association between the level of compliance with IFRS and the value relevance of accounting information to market participants. Since it is among the first countries to adopt IFRS, Kuwait provides an ideal setting in which to explore these objectives. In addition, the Kuwaiti accounting environment provides an interesting regulatory context in which each KSE-listed firm is required to appoint at least two external auditors from separate auditing firms. Based on the research objectives, five research questions (RQs) are addressed. RQ1 and RQ2 aim to determine the extent to which KSE-listed firms comply with IFRS and factors contributing to variations in compliance levels. These factors include firm attributes (firm age, leverage, size, profitability, liquidity), the number of brand name (Big-4) auditing firms auditing a firm’s financial statements, and industry categorization. RQ3 and RQ4 address the value relevance of IFRS-based financial statements to investors. RQ5 addresses whether the level of compliance with IFRS contributes to the value relevance of accounting information provided to investors. Based on the potential improvement in value relevance from adopting and complying with IFRS, it is predicted that the higher the level of compliance with IFRS, the greater the value relevance of book values and earnings. The research design of the study consists of two parts. First, in accordance with prior disclosure research, the level of compliance with mandatory IFRS is examined using a disclosure index. Second, the value relevance of financial statement information, specifically, earnings and book value, is examined empirically using two valuation models: price and returns models. The combined empirical evidence that results from the application of both models provides comprehensive insights into value relevance of accounting information in an emerging market setting. 
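As context for the two valuation models named above, value relevance studies typically estimate specifications along the following lines. This is only a hedged sketch using standard forms from the literature (the price model follows Ohlson, 1995; the returns model follows Easton and Harris, 1991); the thesis's exact variable definitions may differ.

```latex
% Price model: share price on earnings per share and book value per share
P_{it} = \alpha_0 + \alpha_1\,EPS_{it} + \alpha_2\,BVPS_{it} + \varepsilon_{it}

% Returns model: annual return on the level of and change in earnings,
% each scaled by the opening share price
R_{it} = \beta_0 + \beta_1\,\frac{E_{it}}{P_{i,t-1}} + \beta_2\,\frac{\Delta E_{it}}{P_{i,t-1}} + \varepsilon_{it}
```

In both specifications, value relevance is typically read off the explanatory power (R-squared) and the significance of the slope coefficients.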
Consistent with expectations, the results show the average level of compliance with IFRS mandatory disclosures for all KSE-listed firms in 2006 was 72.6 percent, indicating that KSE-listed firms generally did not fully comply with all requirements. Significant variations in the extent of compliance are observed among firms and across accounting standards. As predicted, older, highly leveraged, larger, and profitable KSE-listed firms are more likely to comply with IFRS required disclosures. Interestingly, significant differences in the level of compliance are observed across the three possible auditor combinations of two Big-4 firms, two non-Big-4 firms, and mixed audit firm types. The results for the price and returns models provide evidence that earnings and book values are significant factors in the valuation of KSE-listed firms during the 1995 to 2006 period. However, the results show that the value relevance of earnings and book values decreased significantly during that period, suggesting that investors rely less on financial statements, possibly due to the increase in available non-financial sources of information. Notwithstanding this decline, a significant association is observed between the level of compliance with IFRS and the value relevance of earnings and book value to KSE investors. The findings make several important contributions. First, they raise concerns about the effectiveness of the regulatory body that oversees compliance with IFRS in Kuwait. Second, they challenge the effectiveness of the two-auditor requirement in promoting compliance with regulations, as well as the associated cost-benefit of this requirement for firms. Third, they provide the first known empirical evidence linking the level of IFRS compliance with the value relevance of financial statement information. Finally, the findings are relevant for standard setters and their current review of KSE regulations. In particular, they highlight the importance of establishing and maintaining adequate monitoring and enforcement mechanisms to ensure compliance with accounting standards. In addition, the finding that stricter compliance with IFRS improves the value relevance of accounting information highlights the importance of full compliance with IFRS, not mere adoption.

Relevance: 10.00%

Abstract:

This thesis is a work of creative practice-led research comprising two components. The first component is a speculative thriller novel entitled Diamond Eyes (contracted for publication in 2009 by Harper Collins: Voyager as the first in a trilogy, under the name AA Bell). The second component is an exegesis exploring the notion of re-visioning a novel. Re-visioning, not to be confused with revision, refers to the advanced editing strategies required when the original vision of a novel changes during development.

Relevance: 10.00%

Abstract:

The Mount Isa Basin is a new concept used to describe the area of Palaeo- to Mesoproterozoic rocks south of the Murphy Inlier, presently and inappropriately described as the Mount Isa Inlier. The new basin concept presented in this thesis allows for the characterisation of basin-wide structural deformation, correlation of mineralisation with particular lithostratigraphic and seismic stratigraphic packages, and the recognition of areas with petroleum exploration potential. The northern depositional margin of the Mount Isa Basin is the metamorphic, intrusive and volcanic complex here referred to as the Murphy Inlier (not the "Murphy Tectonic Ridge"). The eastern, southern and western boundaries of the basin are obscured by younger basins (the Carpentaria, Eromanga and Georgina Basins). The Murphy Inlier rocks comprise the seismic basement to the Mount Isa Basin sequence. Evidence is presented for the continuity of the Mount Isa Basin with the McArthur Basin to the northwest and the Willyama Block (Basin) at Broken Hill to the south. These areas, combined with several other areas of similar age, are believed to have comprised the Carpentarian Superbasin (new term). The application of seismic exploration within Authority to Prospect (ATP) 423P at the northern margin of the basin was critical to the recognition and definition of the Mount Isa Basin. The Mount Isa Basin is structurally analogous to the Palaeozoic Arkoma Basin of Illinois and Arkansas in the southern USA but, as with all basins, it contains unique characteristics, a function of its individual development history. The Mount Isa Basin evolved in a manner similar to many well-described, Phanerozoic plate-tectonic-driven basins. A full Wilson Cycle is recognised and a plate tectonic model proposed. The northern Mount Isa Basin is defined as the Proterozoic basin area northwest of the Mount Gordon Fault. Deposition in the northern Mount Isa Basin began with a rift sequence of volcaniclastic sediments, followed by a passive-margin drift phase comprising mostly carbonate rocks. Following the rift and drift phases, major north-south compression produced east-west thrusting in the south of the basin, inverting the older sequences. This compression produced an asymmetric epi- or intra-cratonic, clastic-dominated peripheral foreland basin, provenanced in the south and thinning markedly to a stable platform area (the Murphy Inlier) in the north. The final major deformation comprised east-west compression producing north-south aligned faults that are particularly prominent at Mount Isa. Potential field studies of the northern Mount Isa Basin, principally using magnetic data (and to a lesser extent gravity data, satellite images and aerial photographs), exhibit remarkable correlation with the reflection seismic data. The potential field data contributed significantly to the unravelling of the northern Mount Isa Basin architecture and deformation. Structurally, the Mount Isa Basin consists of three distinct regions. From north to south they are the Bowthorn Block, the Riversleigh Fold Zone and the Cloncurry Orogen (new names). The Bowthorn Block, which is located between the Elizabeth Creek Thrust Zone and the Murphy Inlier, consists of an asymmetric wedge of volcanic, carbonate and clastic rocks. It ranges from over 10 000 m stratigraphic thickness in the south to less than 2000 m in the north.
The Bowthorn Block is relatively undeformed; however, it contains a series of reverse faults trending east-west that are interpreted from seismic data to be down-to-the-north normal faults that have been reactivated as thrusts. The Riversleigh Fold Zone is a folded and faulted region south of the Bowthorn Block, comprising much of the area formerly referred to as the Lawn Hill Platform. The Cloncurry Orogen consists of the area and sequences equivalent to the former Mount Isa Orogen. The name Cloncurry Orogen clearly distinguishes this area from the wider concept of the Mount Isa Basin. The South Nicholson Group and its probable correlatives, the Pilpah Sandstone and Quamby Conglomerate, comprise a later phase of now largely eroded deposits within the Mount Isa Basin. The name South Nicholson Basin is now outmoded, as this terminology applied only to the South Nicholson Group, unlike the original broader definition in Brown et al. (1968). Cored slimhole stratigraphic and mineral wells drilled by Amoco, Esso, Elf Aquitaine and Carpentaria Exploration prior to 1986 penetrated much of the stratigraphy and intersected both minor oil and gas shows plus excellent potential source rocks. The raw data were reinterpreted and augmented with seismic stratigraphy and source rock data from resampled mineral and petroleum stratigraphic exploration wells for this study. Since 1986, Comalco Aluminium Limited, as operator of a joint venture with Monument Resources Australia Limited and Bridge Oil Limited, recorded approximately 1000 km of reflection seismic data within the basin and drilled one conventional stratigraphic petroleum well, Beamesbrook-1. This work was the first reflection seismic survey and the first conventional petroleum test of the northern Mount Isa Basin. When incorporated into the newly developed foreland basin and maturity models, a grass-roots petroleum exploration play was recognised, and this led to the present thesis. The Mount Isa Basin was seen to contain excellent source rocks coupled with potential reservoirs and all of the other essential aspects of a conventional petroleum exploration play. This play, although high risk, was commensurate with the enormous and totally untested petroleum potential of the basin. The basin was assessed for hydrocarbons in 1992 with three conventional exploration wells: Desert Creek-1, Argyle Creek-1 and Egilabria-1. These wells also tested and confirmed the proposed basin model. No commercially viable oil or gas was encountered, although evidence of its former existence was found. In addition to the petroleum exploration, and indeed as a consequence of it, the association of the extensive base metal and other mineralisation in the Mount Isa Basin with hydrocarbons could not be overlooked. A comprehensive analysis of the available data suggests a link between the migration, and possible generation or destruction, of hydrocarbons and metal-bearing fluids. Consequently, base metal exploration based on hydrocarbon exploration concepts is probably the most effective technique in such basins. The metal-hydrocarbon-sedimentary basin-plate tectonic association (analogous to Phanerozoic models) is a compelling outcome of this work on the Palaeo- to Mesoproterozoic Mount Isa Basin. Petroleum within the Bowthorn Block was apparently destroyed by hot brines that produced many ore deposits elsewhere in the basin.

Relevance: 10.00%

Abstract:

My research investigates why nouns are learned disproportionately more frequently than other kinds of words during early language acquisition (Gentner, 1982; Gleitman et al., 2004). This question must be considered in the context of cognitive development in general. Infants have two major streams of environmental information to make meaningful: perceptual and linguistic. Perceptual information flows in from the senses and is processed into symbolic representations by the primitive language of thought (Fodor, 1975). These symbolic representations are then linked to linguistic input to enable language comprehension and, ultimately, production. Yet how exactly does perceptual information become conceptualized? Although this question is difficult, there has been progress. One way that children might have an easier job is if they have structures that simplify the data. Thus, if particular sorts of perceptual information could be separated from the mass of input, it would be easier for children to refer to those specific things when learning words (Spelke, 1990; Pylyshyn, 2003). It would be easier still if linguistic input were segmented in predictable ways (Gentner, 1982; Gleitman et al., 2004). Unfortunately, the frequency of patterns in lexical or grammatical input cannot explain the cross-cultural and cross-linguistic tendency to favor nouns over verbs and predicates. There are three examples of this failure: 1) a wide variety of nouns are uttered less frequently than a smaller number of verbs and yet are learnt far more easily (Gentner, 1982); 2) word order and morphological transparency offer no insight when you contrast the sentence structures and word inflections of different languages (Slobin, 1973); and 3) particular language-teaching behaviors (e.g. pointing at objects and repeating names for them) have little impact on children's tendency to prefer concrete nouns in their first fifty words (Newport et al., 1977). Although the linguistic solution appears problematic, there has been increasing evidence that the early visual system does indeed segment perceptual information in specific ways before the conscious mind begins to intervene (Pylyshyn, 2003). I argue that nouns are easier to learn because their referents directly connect with innate features of the perceptual faculty. This hypothesis stems from work done on visual indexes by Zenon Pylyshyn (2001, 2003). Pylyshyn argues that the early visual system (the architecture of the "vision module") segments perceptual data into pre-conceptual proto-objects called FINSTs. FINSTs typically correspond to physical things such as Spelke objects (Spelke, 1990). Hence, before conceptualization, visual objects are picked out by the perceptual system demonstratively, like a pointing finger indicating ‘this’ or ‘that’. I suggest that this primitive system of demonstration elaborates on Gareth Evans's (1982) theory of nonconceptual content. Nouns are learnt first because their referents attract demonstrative visual indexes. This theory also explains why infants less often name stationary objects, such as ‘plate’ or ‘table’, but do name things that attract the focal attention of the early visual system, i.e., small objects that move, such as ‘dog’ or ‘ball’. This view leaves open the questions of how blind children learn words for visible objects, and why children learn category nouns (e.g. ‘dog’) rather than proper nouns (e.g. ‘Fido’) or higher taxonomic distinctions (e.g. ‘animal’).

Relevance: 10.00%

Abstract:

The concept of asset management is not a new but an evolving idea that has been attracting the attention of many organisations operating and/or owning some kind of infrastructure assets. The term asset management has been used widely, with fundamental differences in interpretation and usage. Regardless of the context of its usage, asset management implies the process of optimising return by scrutinising performance and making key strategic decisions throughout all phases of an asset's lifecycle (Sarfi and Tao, 2004). Hence, asset management is a philosophy and discipline through which organisations are enabled to deploy their resources more effectively, providing higher levels of customer service and reliability while balancing financial objectives. In Australia, asset management made its way into public works in 1993, when the Australian Accounting Standards Board issued Australian Accounting Standard 27 (AAS27). Standard AAS27 required government agencies to capitalise and depreciate assets rather than expense them against earnings. This development has indirectly forced organisations managing infrastructure assets to consider the useful life and cost-effectiveness of asset investments. The Australian State Treasuries and the Australian National Audit Office were the first organisations to formalise the concepts and principles of asset management in Australia, defining asset management as “a systematic, structured process covering the whole life of an asset” (Australian National Audit Office, 1996). This initiative led other government bodies and industry sectors to develop, refine and apply the concept of asset management in the management of their respective infrastructure assets. Hence, it can be argued that asset management emerged as a separate and recognised field of management during the late 1990s. In comparison to other disciplines such as construction, facilities, maintenance, project management, economics and finance, to name a few, asset management is a relatively new discipline and is clearly a contemporary topic. The primary contributors to the literature in asset management are largely government organisations and industry practitioners. These contributions take the form of guidelines and reports on the best practice of asset management. More recently, some of these best practices have become standards, such as PAS 55 (IAM, 2004; IAM, 2008b) in the UK. As such, current literature in this field tends to lack well-grounded theories. To date, while receiving relatively more interest and attention from empirical researchers, the advancement of this field, particularly in terms of the volume of academic and theoretical development, is at best moderate. A plausible reason for the lack of advancement is that many researchers and practitioners are still unaware of, or unimpressed by, the contribution that asset management can make to the performance of infrastructure assets. This paper seeks to explore the practices of organisations that manage infrastructure assets in order to develop a framework of strategic infrastructure asset management processes. It begins by examining the development of asset management. This is followed by a discussion of the method adopted for this paper. Next is a discussion of the results from the case studies: it first describes the goals of infrastructure asset management and how they can support the broader business goals. Following this, a set of core processes that can support the achievement of business goals is provided. These core processes are synthesised from the practices of asset managers in the case study organisations.

Relevance: 10.00%

Abstract:

Report for City Design, on behalf of Environment and Parks, within Brisbane City Council. Context of this project: A Conservation Study for the Old Brisbane Botanic Gardens, formerly called the Brisbane City Botanic Gardens, was finalised in 1995 and prepared by Jeannie Sim for the Landscape Section of Brisbane City Council; she is also the author of the present report. This unpublished report was the first conservation plan prepared for the place, and it was recommended that it be reviewed in five years' time. That time has finally arrived with the preparation of the 2005 Review. The present project was commissioned by City Design on behalf of the Environment and Parks Section of Brisbane City Council. The author has purposely chosen to call the study site the 'Old Brisbane Botanic Gardens' (OBBG) to differentiate it from the Brisbane Botanic Gardens, Mt. Coot-tha (BBG-MC), and to maintain the claim for this original garden to remain a botanic garden for Brisbane. This name immediately brings to mind an association with history, as in the precedent set by the naming of the nearby 'Old Government House' at Gardens Point.

Relevance: 10.00%

Abstract:

Groundwater is increasingly recognised as an important yet vulnerable natural resource, and a key consideration in water cycle management. However, communicating the behaviour of sub-surface water systems, an important part of encouraging better water management, is visually difficult. Modern 3D visualisation techniques can be used to communicate these complex behaviours effectively, engaging and informing community stakeholders. Most software developed for this purpose is expensive and requires specialist skills. The Groundwater Visualisation System (GVS) developed by QUT integrates a wide range of surface and sub-surface data to produce a 3D visualisation of the behaviour, structure and connectivity of groundwater/surface water systems. Surface data (elevation, surface water, land use, vegetation and geology) are combined with data collected from boreholes (bore locations and subsurface geology) to visualise the nature, structure and connectivity of these systems. Time-series data (water levels, groundwater quality, rainfall, stream flow and groundwater abstraction) are displayed as animations within the 3D framework, or graphically, to show how water system conditions change over time. GVS delivers an interactive, stand-alone 3D visualisation product that can be used in a standard PC environment; no specialised training or modelling skills are required. The software has been used extensively in the SEQ region to inform and engage water managers and the community alike. Examples will be given of GVS visualisations developed in areas where there have been community concerns about groundwater over-use and contamination.

Relevance: 10.00%

Abstract:

Before making a security or privacy decision, Internet users should evaluate several security indicators in their browser, such as the use of HTTPS (indicated via the lock icon), the domain name of the site, and information from extended validation certificates. However, studies have shown that human subjects infrequently employ these indicators, relying on other indicators that can be spoofed and convey no cryptographic assurances. We identify four simple security indicators that accurately represent security properties of the connection and then examine 125 popular websites to determine if the sites' designs result in correctly displayed security indicators during login. In the vast majority of cases, at least some security indicators are absent or suboptimal. This suggests users are becoming habituated to ignoring recommended security indicators.

Relevance: 10.00%

Abstract:

Streaming SIMD Extensions (SSE) is a special feature available in the Intel Pentium III and P4 classes of microprocessors. As the name implies, SSE enables the execution of SIMD (Single Instruction, Multiple Data) operations on 32-bit floating-point data; the performance of floating-point algorithms can therefore be improved. In electrified railway system simulation, the computation involves solving a huge set of simultaneous linear equations that represent the electrical characteristics of the railway network at a particular time-step, and a fast solution of the equations is desirable in order to simulate the system in real time. In this paper, we present how SSE is applied to railway network simulation.
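The abstract does not reproduce the paper's kernels, so the following is only an illustrative sketch of the technique it names: an SSE multiply-accumulate loop of the kind that dominates direct solvers for such linear systems (e.g. the row-update step of Gaussian elimination). The function name and signature are hypothetical, not the authors'.

```c
#include <xmmintrin.h>  /* SSE intrinsics, available since the Pentium III */

/* y[i] += a * x[i] over n 32-bit floats, four lanes per iteration.
   This row update is the inner loop of Gaussian elimination on the
   network equations, so vectorising it speeds up the whole solve. */
static void row_update_sse(float a, const float *x, float *y, int n)
{
    __m128 va = _mm_set1_ps(a);                  /* broadcast a to all 4 lanes */
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 vx = _mm_loadu_ps(x + i);         /* load 4 floats from x */
        __m128 vy = _mm_loadu_ps(y + i);         /* load 4 floats from y */
        vy = _mm_add_ps(vy, _mm_mul_ps(va, vx)); /* 4 multiply-adds at once */
        _mm_storeu_ps(y + i, vy);                /* store the 4 results */
    }
    for (; i < n; i++)                           /* scalar tail when n % 4 != 0 */
        y[i] += a * x[i];
}
```

Processing four floats per instruction is where the speed-up over scalar code comes from; in practice, aligned loads (_mm_load_ps on 16-byte-aligned arrays) would be preferred on these processors.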

Relevance: 10.00%

Abstract:

Rapid prototyping (RP) is a common name for several techniques which read in data from computer-aided design (CAD) drawings and automatically manufacture three-dimensional objects layer by layer according to the virtual design. The utilization of RP in tissue engineering enables the production of three-dimensional scaffolds with complex geometries and very fine structures. Adding micro- and nanometer details to the scaffolds improves the mechanical properties of the scaffold and ensures better cell adhesion to the scaffold surface. Thus, tissue engineering constructs can be customized according to data acquired from medical scans to match each patient's individual needs. In addition, RP enables control of scaffold porosity, making it possible to fabricate applications with the desired structural integrity. Unfortunately, every RP process has its own unique disadvantages in building tissue engineering scaffolds. Hence, future research should be focused on the development of RP machines designed specifically for the fabrication of tissue engineering scaffolds, although RP methods can already serve as a link between tissue and engineering.