229 results for Multidimensional projection


Relevance: 10.00%

Abstract:

The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of global warming on the built environment, the use of building simulation techniques together with forecast weather data is often necessary. Since all building simulation programs require hourly meteorological input data for their thermal comfort and energy evaluations, the provision of suitable weather data becomes critical. Based on a review of the existing weather data generation models, this paper presents an effective method to generate approximate future hourly weather data suitable for the study of the impact of global warming. Depending on the level of information available for the prediction of future weather conditions, it is shown that either the method of retaining values at the current level, the constant offset method, or the diurnal modelling method may be used to generate the future hourly variation of an individual weather parameter. An example of the application of this method to different global warming scenarios in Australia is presented. Since there is no reliable projection of possible changes in air humidity, solar radiation or wind characteristics, as a first approximation these parameters have been assumed to remain at current levels. A sensitivity test of their impact on building energy performance shows that there is generally a good linear relationship between building cooling load and changes in solar radiation, relative humidity or wind speed.
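To make the constant offset method concrete, the following is a minimal sketch under stated assumptions (hypothetical variable names and offset values; not the paper's implementation): the current hourly temperature series is shifted by a fixed scenario offset, while the other weather parameters are retained at current levels, as the paper assumes when no reliable projection is available.

```python
# Minimal sketch of the "constant offset" method for generating approximate
# future hourly weather data (illustrative only; names and values are hypothetical).
import numpy as np

def constant_offset(current_hourly_temp, annual_warming_offset):
    """Shift every hour of the current-year temperature series by a fixed
    scenario offset (e.g. +2.0 degC for a given global-warming scenario)."""
    return np.asarray(current_hourly_temp) + annual_warming_offset

# Example: a synthetic year of hourly dry-bulb temperatures under a +2 degC scenario.
current_temps = 20.0 + 8.0 * np.sin(np.linspace(0, 2 * np.pi, 8760))
future_temps = constant_offset(current_temps, annual_warming_offset=2.0)

# Humidity, solar radiation and wind are retained at current levels,
# as assumed in the paper when no reliable projection is available.
```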

Relevance: 10.00%

Abstract:

Knowmore (House of Commons) is a large-scale generative interactive installation that incorporates embodied interaction, dynamic image creation, new furniture forms, touch sensitivity, innovative collaborative processes and multichannel generative sound creation. A large circular table is spun by hand while a computer-controlled video projection falls onto its top, creating an uncanny blend of physical object and virtual media. Participants’ presence around the table, and how they touch it, is registered, allowing up to five people to collaboratively ‘play’ this deeply immersive audiovisual work. Set within an ecological context, the work subtly asks what kind of resources and knowledges might be necessary to move us past simply knowing what needs to be changed to instead actually embodying that change, whilst hinting at other deeply relational ways of understanding and knowing the world. The work has successfully operated in two high-traffic public environments, generating a subtle form of interactivity that allows different people to interact at different paces and speeds and with differing intentions, each contributing towards dramatic public outcomes. The research field involved developing new interaction and engagement strategies for eco-political media arts practice. The context was the creation of improved embodied, performative and improvisational experiences for participants, further informed by ‘Sustainment’ theory. The central question was what ontological shifts may be necessary to better envision and align our everyday life choices in ways that respect that which is shared by all - 'The Commons'. The methodology was primarily practice-led and in concert with underlying theories. The work’s knowledge contribution was to question how new media interactive experience and embodied interaction might prompt participants to reflect upon the kind of resources and knowledges required to move past simply knowing what needs to be changed to instead actually embodying that change. This was achieved through focusing on the power of embodied learning implied by the work’s strongly physical interface (i.e. the spinning of a full-size table) in concert with the complex field of layered imagery and sound. The work was commissioned by the State Library of Queensland and Queensland Artworkers Alliance and significantly funded by the Australia Council for the Arts, Arts Queensland, QUT, RMIT Centre for Animation and Interactive Media and industry partners E2E Visuals. After premiering for three months at the State Library of Queensland it was curated into the significant ‘Mediations Biennial of Modern Art’ in Poznan, Poland. The work formed the basis of two papers, was reviewed in Realtime (90), was overviewed at Subtle Technologies (2010) in Toronto, was shortlisted for ISEA 2011 Istanbul, and was included in the edited book/catalogue ‘Art in Spite of Economics’, a collaboration between Leonardo/ISAST (MIT Press); Goldsmiths, University of London; ISEA International; and Sabanci University, Istanbul.

Relevance: 10.00%

Abstract:

Technology-based self-service (TBSS) enables consumers to complete services themselves using a technological interface. As evaluations of consumer satisfaction and commitment have typically focused on interpersonal interactions, the effect of TBSS on these is under-researched. This paper explores the impact of TBSS on consumer satisfaction and on a multidimensional measure of consumer commitment. Data were collected from 241 hotel guests. The results suggest personal service is more important for satisfaction and commitment. This has implications for marketing, as the benefits of adopting TBSS are not clear. The multidimensional measure of commitment yields some interesting findings and suggests the need for further research into TBSS and commitment.

Relevance: 10.00%

Abstract:

Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas, such that they seem self-evident and inevitable after the event. Yet the next step is anything but obvious for the artist/creator/inventor/designer stuck at that point just before the creative leap. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity - it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques. By the time the problem has been defined, it has been solved. Indeed the solution is often the very definition of the problem. Design must be creative - or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question must arise: can we find some way of searching the space ahead? Of course there are serious problems of knowing what we are looking for and of the vastness of the search space. It may be better to discard altogether the term "searching" in the context of the design process. Conceptual analogies such as search, search spaces and fitness landscapes aim to elucidate the design process. However, the vastness of the multidimensional spaces involved makes these analogies misguided, and they thereby further confound the issue. The term search becomes a misnomer, since it carries the connotation that it is possible to find what you are looking for. In such vast spaces the term search must be discarded. Thus, any attempt at searching for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious. Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless we still have the tantalizing possibility that if a creative idea seems inevitable after the event, then somehow the process might be reversed. This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal-directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process, without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). But nevertheless Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as evolutionary algorithms, which are usually thought of as search algorithms.
It is necessary to abandon such connections with searching and see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments. Most importantly, nature has all the time in the world. As designers we cannot afford such profligate prototyping and ruthless experimentation, nor can we operate on the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
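The generative (rather than search-oriented) reading of the evolutionary analogy can be illustrated with a minimal generational algorithm. The sketch below is a hypothetical toy example, not the authors' system: it treats the algorithm as a generator of alternative "prototypes" that are evaluated virtually and culled, with the toy genotype and evaluation standing in for a domain-specific design representation.

```python
# Minimal generational evolutionary sketch: generate, evaluate, select, vary.
# Illustrative only; a real design application would replace the toy genotype
# and evaluation function with domain-specific representations.
import random

def evaluate(genotype):
    # Placeholder "virtual prototyping" step: score a candidate design.
    return -sum((g - 0.5) ** 2 for g in genotype)

def evolve(pop_size=20, genes=8, generations=50, mutation_rate=0.1):
    population = [[random.random() for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=evaluate, reverse=True)
        survivors = scored[: pop_size // 2]            # ruthless elimination of weaker prototypes
        offspring = []
        for parent in survivors:
            child = [g + random.gauss(0, mutation_rate) if random.random() < 0.5 else g
                     for g in parent]
            offspring.append(child)                    # continuous experiment via variation
        population = survivors + offspring
    return max(population, key=evaluate)

best = evolve()
```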

Relevance: 10.00%

Abstract:

As editors of the recently published Vocational psychological and organisational perspectives on career: Towards a multidisciplinary dialogue (Collin & Patton, 2009), we have considerable interest in this particular issue of the Australian Journal of Career Development. This short piece will first present the purpose and thesis of that book and, in the light of them, will then comment on the four papers. The book suggests that to understand the multidimensional and multilayered nature of career, “it has to be studied in a similarly multilayered and multi-perspectival way, and, indeed, it has been” (p. 3). Scholars have pointed out that there is a wide array of disciplines including economics, sociology, anthropology, geography, political science, various branches of psychology (e.g. industrial/organisational (I/O), vocational, counselling), psychiatry, education, organisation studies, organisational behaviour, personnel/human resource management, industrial relations, and more, all of which have something to say about career. Of these, the most influential, according to Peiperl and Arthur (2000), have been psychology, sociology, education and management. These many disciplinary perspectives on career constitute the rich field of career studies.

Relevance: 10.00%

Abstract:

Internal marketing has been discussed in the management and academic literature for over three decades, yet it remains ill-defined and poorly operationalized. This paper responds to calls for research to develop a single clear understanding of the construct, for the development of a suitable instrument to measure it, and for empirical evidence of its impact. Existing, divergent conceptualizations of internal marketing are explored, and a new, multidimensional construct describing the managerial behaviors associated with internal marketing is developed and termed internal market orientation (IMO). IMO represents the adaptation of market orientation to the context of employer-employee exchanges in the internal market. The paper describes the development of a valid and reliable measure of IMO in a retail services context. Five dimensions of IMO are identified and confirmed: 1) formal written information generation, 2) formal face-to-face information generation, 3) informal information generation, 4) communication and dissemination of information, and 5) responding to this internal market information. The impact of IMO on important organizational factors is also explored. Results indicate positive consequences for customer satisfaction, relative competitive position, staff attitudes, staff retention and staff compliance.

Relevance: 10.00%

Abstract:

Changing informational constraints of practice, such as when using ball projection machines, has been shown to significantly affect movement coordination of skilled cricketers. To date, there has been no similar research on movement responses of developing batters, an important issue since ball projection machines are used heavily in cricket development programmes. Timing and coordination of young cricketers (n = 12, age = 15.6 ± 0.7 years) were analyzed during the forward defensive and forward drive strokes when facing a bowling machine and bowler (both with a delivery velocity of 28.14 ± 0.56 m·s⁻¹). Significant group performance differences were observed between the practice task constraints, with earlier initiation of the backswing, front foot movement, downswing, and front foot placement when facing the bowler compared to the bowling machine. Peak height of the backswing was higher when facing the bowler, along with a significantly larger step length. Altering the informational constraints of practice caused major changes to the information–movement couplings of developing cricketers. Data from this study were interpreted to emanate from differences in available specifying variables under the distinct practice task constraints. Considered with previous findings, results confirmed the need to ensure representative batting task constraints in practice, cautioning against an over-reliance on ball projection machines in cricket development programmes.

Relevance: 10.00%

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and beg the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as being multidimensional and multi-level, this hierarchical approach to SQ measurement better reflecting human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content and nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfil the track’s vision.
Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension to ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define the sub-dimensions. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. The latter statement suggests that customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and the choice of approach would depend on the objective(s) of the study. Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it has the ability to identify areas that need improvement.
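The contrast between the two measurement approaches can be sketched numerically. The item scores below are hypothetical (a 1-7 scale is assumed; they are not data from the study): the perceptions-only score is simply the mean perception, while the disconfirmation approach computes a gap per dimension and highlights shortfalls.

```python
# Sketch contrasting perceptions-only and measured-disconfirmation scoring
# (hypothetical item scores on a 1-7 scale; not data from the study).
expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 5.5,
                "empathy": 6.0, "responsiveness": 6.5}
perceptions  = {"reliability": 5.0, "assurance": 5.5, "tangibles": 5.0,
                "empathy": 5.5, "responsiveness": 4.5}

# Perceptions-only: overall SQ is the mean perception score.
perceptions_only_score = sum(perceptions.values()) / len(perceptions)

# Measured disconfirmation: gap = perception - expectation per dimension,
# so negative gaps point to the areas needing improvement.
gaps = {d: perceptions[d] - expectations[d] for d in perceptions}
```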

Relevance: 10.00%

Abstract:

The over-representation of novice drivers in crashes is alarming. Driver training is one of the interventions aimed at mitigating the number of crashes that involve young drivers. Experienced drivers have better hazard perception ability than inexperienced drivers. Eye gaze patterns have been found to be an indicator of a driver's competency level. The aim of this paper is to develop an in-vehicle system which correlates information about the driver's gaze with vehicle dynamics, which is then used to assist driver trainers in assessing driving competency. This system allows visualization of the complete driving manoeuvre data on interactive maps. It uses an eye tracker and perspective projection algorithms to compute the depth of gaze and plots it on Google Maps. The interactive map also features the trajectory of the vehicle and turn indicator usage. The system allows efficient and user-friendly analysis of the driving task. It can be used by driver trainers and trainees to understand objectively the risks encountered during driving manoeuvres. This paper presents a prototype that plots the driver's eye gaze depth and direction on an interactive map along with the vehicle dynamics information. This prototype will be used in future work to study differences in gaze patterns between novice and experienced drivers prior to a given manoeuvre.
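As a rough illustration of plotting a gaze point on a map, the sketch below assumes the gaze depth and gaze bearing relative to the vehicle heading have already been estimated by the eye tracker and perspective-projection step; it is not the authors' implementation, and the coordinate conversion uses a simple flat-earth approximation suitable only for short distances.

```python
# Rough sketch: project the estimated gaze point onto map coordinates
# (illustrative assumptions; not the paper's algorithm).
import math

def gaze_point_latlon(veh_lat, veh_lon, veh_heading_deg, gaze_yaw_deg, gaze_depth_m):
    """Place the gaze point gaze_depth_m metres from the vehicle, along the
    vehicle heading offset by the gaze yaw, and return an approximate
    latitude/longitude suitable for plotting on a map."""
    bearing = math.radians(veh_heading_deg + gaze_yaw_deg)
    north = gaze_depth_m * math.cos(bearing)
    east = gaze_depth_m * math.sin(bearing)
    dlat = north / 111_320.0                                   # metres per degree latitude
    dlon = east / (111_320.0 * math.cos(math.radians(veh_lat)))
    return veh_lat + dlat, veh_lon + dlon

# Example: driver looking 15 degrees right of an eastward heading, 40 m ahead.
lat, lon = gaze_point_latlon(-27.4698, 153.0251, veh_heading_deg=90.0,
                             gaze_yaw_deg=15.0, gaze_depth_m=40.0)
```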

Relevance: 10.00%

Abstract:

Driver distraction continues to receive considerable research interest, but the drivers’ perspective is less well documented. The current research focussed on identifying features that are salient to drivers in their risk perception judgements for 19 in-vehicle distractions. Both technological (e.g. mobile phones) and non-technological (e.g. eating) distractions were considered. Analysis identified that males and females rated 7 of the 19 distractions differently. The current paper presents the data for the female participants (n = 84). Multidimensional scaling analysis identified three main dimensions contributing to female drivers’ risk perception judgements. Qualitative characteristics, such as the level of exposure to a distraction, were identified as significant contributors to drivers’ risk perception, as were features inherent in the distractions, such as distractions being related to communication. This exploratory work contributes to a better understanding of female drivers’ perceptions of the risk associated with in-vehicle distractions. Understanding the drivers’ perspective can help guide the development of road safety messages and ultimately improve the impact of such messages.
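For readers unfamiliar with the technique, the sketch below shows the general shape of a multidimensional scaling analysis in Python. The dissimilarity matrix is randomly generated and purely hypothetical (the study's data are not reproduced here); the point is simply that a three-dimensional configuration of the 19 distraction items is recovered from pairwise dissimilarities.

```python
# Minimal multidimensional scaling sketch (hypothetical data, not the study's):
# recover a low-dimensional configuration of distraction items from pairwise
# dissimilarities in perceived risk.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_items = 19                                   # e.g. the 19 in-vehicle distractions
d = rng.random((n_items, n_items))
dissimilarity = (d + d.T) / 2.0                # symmetric pairwise dissimilarities
np.fill_diagonal(dissimilarity, 0.0)

mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)      # one row of 3 coordinates per item
```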

Relevance: 10.00%

Abstract:

This paper presents Scatter Difference Nuisance Attribute Projection (SD-NAP) as an enhancement to NAP for SVM-based speaker verification. While standard NAP may inadvertently remove desirable speaker variability, SD-NAP explicitly de-emphasises this variability by incorporating a weighted version of the between-class scatter into the NAP optimisation criterion. Experimental evaluation of SD-NAP with a variety of SVM systems on the 2006 and 2008 NIST SRE corpora demonstrates that SD-NAP provides improved verification performance over standard NAP in most cases, particularly at the EER operating point.
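A conceptual sketch of the idea, reconstructed loosely from the abstract and not the authors' code: nuisance directions are taken as the leading eigenvectors of the within-class (within-speaker) scatter minus a weighted between-class scatter, so that directions carrying speaker variability are de-emphasised before the projection removes them. The function and parameter names below are hypothetical.

```python
# Conceptual scatter-difference NAP sketch (hedged reconstruction, not the paper's code).
import numpy as np

def sd_nap_projection(X, labels, k=10, alpha=0.5):
    """X: (n_samples, dim) supervectors; labels: speaker identity per row.
    Returns a (dim, dim) projection that removes the top-k nuisance directions."""
    mean = X.mean(axis=0)
    dim = X.shape[1]
    Sw = np.zeros((dim, dim))
    Sb = np.zeros((dim, dim))
    for spk in np.unique(labels):
        Xs = X[labels == spk]
        mu = Xs.mean(axis=0)
        Sw += (Xs - mu).T @ (Xs - mu)                    # within-speaker (nuisance) scatter
        Sb += len(Xs) * np.outer(mu - mean, mu - mean)   # between-speaker scatter
    eigvals, eigvecs = np.linalg.eigh(Sw - alpha * Sb)   # scatter-difference criterion
    U = eigvecs[:, np.argsort(eigvals)[::-1][:k]]        # top-k nuisance directions
    return np.eye(dim) - U @ U.T

# Usage: X_clean = X @ sd_nap_projection(X, labels).T before SVM training.
```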

Relevance: 10.00%

Abstract:

Mapping the physical world, the arrangement of continents and oceans, cities and villages, mountains and deserts, while not without its own contentious aspects, can at least draw upon centuries of previous work in cartography and discovery. To map virtual spaces is another challenge altogether. Are cartographic conventions applicable to depictions of the blogosphere, or the internet in general? Is a more mathematical approach required to even start to make sense of the shape of the blogosphere, to understand the network created by and between blogs? As my research compares information flows between Australian and French political blogs, visualising the data obtained is important, as it can demonstrate the spread of ideas and topics across blogs. However, how best to depict the flows, links, and the spaces between them is still unclear. Are network theory and systems of hubs and nodes more relevant than mass communication theories to the research at hand, influencing the nature of any map produced? Is it even a good idea to try and apply boundaries like ‘Australian’ and ‘French’ to parts of a map that do not reflect international borders or the Mercator projection? While drawing upon some of my work-in-progress, this paper will also evaluate previous maps of the blogosphere and approaches to depicting networks of blogs. As such, the paper will provide a greater awareness of the tools available and of the strengths and limitations of mapping methodologies, helping to shape the direction of my research in a field still very much under development.

Relevance: 10.00%

Abstract:

Over the last decade, the rapid growth and adoption of the World Wide Web has further exacerbated user needs for efficient mechanisms for information and knowledge location, selection, and retrieval. How to gather useful and meaningful information from the Web becomes challenging to users. The capture of user information needs is key to delivering users' desired information, and user profiles can help to capture information needs. However, effectively acquiring user profiles is difficult. It is argued that if user background knowledge can be specified by ontologies, more accurate user profiles can be acquired and thus information needs can be captured effectively. Web users implicitly possess concept models that are obtained from their experience and education, and use the concept models in information gathering. Prior to this work, much research has attempted to use ontologies to specify user background knowledge and user concept models. However, these works have a drawback in that they cannot move beyond the subsumption of super- and sub-class structure to emphasising the specific semantic relations in a single computational model. This has also been a challenge for years in the knowledge engineering community. Thus, using ontologies to represent user concept models and to acquire user profiles remains an unsolved problem in personalised Web information gathering and knowledge engineering. In this thesis, an ontology learning and mining model is proposed to acquire user profiles for personalised Web information gathering. The proposed computational model emphasises the specific is-a and part-of semantic relations in one computational model. The world knowledge and users' Local Instance Repositories are used to attempt to discover and specify user background knowledge. From a world knowledge base, personalised ontologies are constructed by adopting automatic or semi-automatic techniques to extract user interest concepts, focusing on user information needs. A multidimensional ontology mining method, Specificity and Exhaustivity, is also introduced in this thesis for analysing the user background knowledge discovered and specified in user personalised ontologies. The ontology learning and mining model is evaluated by comparing it with human-based and state-of-the-art computational models in experiments, using a large, standard data set. The experimental results are promising. The proposed ontology learning and mining model in this thesis helps to develop a better understanding of user profile acquisition, thus providing better design of personalised Web information gathering systems. The contributions are increasingly significant, given both the rapid explosion of Web information in recent years and today's accessibility to the Internet and the full-text world.

Relevance: 10.00%

Abstract:

As climate change will entail new conditions for the built environment, the thermal behaviour of air-conditioned office buildings may also change. Using building computer simulations, the impact of warmer weather is evaluated on the design and performance of air-conditioned office buildings in Australia, including the increased cooling loads and probable indoor temperature increases due to a possibly undersized air-conditioning system, as well as the possible change in energy use. It is found that existing office buildings would generally be able to adapt to the increased warmth projected under the year 2030 Low and High scenarios and the year 2070 Low scenario. However, for the 2070 High scenario, the study indicates that existing office buildings in all capital cities of Australia would suffer from overheating problems. For existing buildings designed for current climate conditions, it is shown that there is a nearly linear correlation between the increase in average external air temperature and the increase in building cooling load. For new buildings designed for warmer scenarios, a 28-59% increase in cooling capacity would be required under the 2070 High scenario.
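The reported near-linear relationship can be exploited as a simple predictive rule of thumb. The sketch below fits a line to hypothetical (temperature increase, cooling load increase) pairs; the numbers are invented for illustration and are not results from the study.

```python
# Hypothetical illustration of a near-linear temperature / cooling-load relationship
# (invented numbers, not the study's results).
import numpy as np

delta_temp = np.array([0.5, 1.0, 2.0, 3.0, 4.0])     # degC increase in mean external air temperature
delta_load = np.array([3.0, 6.5, 12.5, 19.0, 25.5])  # % increase in building cooling load

slope, intercept = np.polyfit(delta_temp, delta_load, deg=1)
predicted_increase = slope * 2.5 + intercept          # e.g. estimate for a +2.5 degC scenario
```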

Relevance: 10.00%

Abstract:

With the phenomenal growth of electronic data and information, there is strong demand for the development of efficient and effective systems (tools) to perform data mining tasks on multidimensional databases. Association rules describe associations between items in the same transaction (intra) or in different transactions (inter). Association mining attempts to find interesting or useful association rules in databases: this is the crucial issue for the application of data mining in the real world. Association mining can be used in many application areas, such as the discovery of associations between customers' locations and shopping behaviours in market basket analysis. Association mining includes two phases. The first phase, called pattern mining, is the discovery of frequent patterns. The second phase, called rule generation, is the discovery of interesting and useful association rules from the discovered patterns. The first phase, however, often takes a long time to find all frequent patterns, and these also include much noise. The second phase is also a time-consuming activity that can generate many redundant rules. To improve the quality of association mining in databases, this thesis provides an alternative technique, granule-based association mining, for knowledge discovery in databases, where a granule refers to a predicate that describes common features of a group of transactions. The new technique first transfers transaction databases into basic decision tables, then uses multi-tier structures to integrate pattern mining and rule generation in one phase for both intra- and inter-transaction association rule mining. To evaluate the proposed technique, this research defines the concept of meaningless rules by considering the correlations between data dimensions for intra-transaction association rule mining. It also uses precision to evaluate the effectiveness of inter-transaction association rules. The experimental results show that the proposed technique is promising.
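To give a feel for the granule idea described above, the toy sketch below groups transactions that share the same values on a set of condition attributes and computes support and confidence over those groups. The attributes and data are hypothetical, and this is a deliberate simplification, not the thesis's multi-tier method.

```python
# Toy illustration of granules: a granule groups transactions sharing the same
# values on a set of condition attributes; rule support/confidence can then be
# computed over granules rather than raw transactions. Simplified sketch only.
from collections import defaultdict

transactions = [
    {"location": "city",   "day": "weekend", "basket": "groceries"},
    {"location": "city",   "day": "weekend", "basket": "groceries"},
    {"location": "city",   "day": "weekday", "basket": "lunch"},
    {"location": "suburb", "day": "weekend", "basket": "groceries"},
]

def granules(records, attributes):
    """Group transactions by their values on the given condition attributes."""
    groups = defaultdict(list)
    for r in records:
        groups[tuple(r[a] for a in attributes)].append(r)
    return groups

cond = granules(transactions, ["location", "day"])

# Support and confidence for the rule (location=city, day=weekend) -> basket=groceries.
matching = cond[("city", "weekend")]
hits = [r for r in matching if r["basket"] == "groceries"]
support = len(hits) / len(transactions)
confidence = len(hits) / len(matching)
```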