220 results for 230101 Mathematical Logic, Set Theory, Lattices And Combinatorics
Abstract:
It is traditional to initialise Kalman filters and extended Kalman filters with estimates of the states calculated directly from the observed (raw) noisy inputs, but unfortunately their performance is extremely sensitive to state initialisation accuracy: good initial state estimates ensure fast convergence, whereas poor estimates may give rise to slow convergence or even filter divergence. Divergence is generally due to excessive observation noise and leads to error magnitudes that quickly become unbounded (R.J. Fitzgerald, 1971). When a filter diverges it must be re-initialised, but because the observations are extremely poor, the re-initialised states will again be poorly estimated. The paper proposes that if neurofuzzy estimators produce more accurate state estimates than those calculated from the observed noisy inputs (using the known state model), then neurofuzzy estimates can be used to initialise the states of Kalman and extended Kalman filters. Filters whose states have been initialised with neurofuzzy estimates should give improved performance by way of faster convergence, both when the filter is first initialised and when it is re-started after divergence.
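The initialisation sensitivity described above can be seen even in a scalar filter. The sketch below is a minimal, hypothetical illustration, not the paper's neurofuzzy estimator: the same scalar Kalman filter is run from a good and a poor initial state estimate, and the poorly initialised run is still far from the truth after the well-initialised one has converged.

```python
import random

def kalman_1d(z_seq, x0, p0, q=1e-4, r=1.0):
    """Scalar Kalman filter estimating a near-constant state observed in noise."""
    x, p = x0, p0
    estimates = []
    for z in z_seq:
        p = p + q            # predict (near-constant state model)
        k = p / (p + r)      # Kalman gain
        x = x + k * (z - x)  # correct with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return estimates

random.seed(0)
true_state = 5.0
observations = [true_state + random.gauss(0, 1.0) for _ in range(50)]

good = kalman_1d(observations, x0=5.2, p0=1.0)   # initialised near the truth
poor = kalman_1d(observations, x0=50.0, p0=1.0)  # initialised far away

# After only a few updates the well-initialised filter is much closer to the
# truth than the poorly initialised one, which is still converging.
```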
Abstract:
Purpose: The purpose of this research is to explore the idea of the participatory library in higher education settings. It aims to address the question: what is a participatory university library? Design/methodology/approach: A grounded theory approach was adopted. In-depth individual interviews were conducted with two diverse groups of participants: ten library staff members and six library users. Data collection and analysis were carried out simultaneously and complied with Straussian grounded theory principles and techniques. Findings: Three core categories representing the participatory library were found: “community”, “empowerment”, and “experience”. Each category was thoroughly delineated via sub-categories, properties, and dimensions that together create a foundation for the participatory library. A participatory library model was also developed, together with an explanation of its building blocks, providing a deeper understanding of the participatory library phenomenon. Research limitations: The research focuses on a specific library system, i.e., academic libraries. Therefore, the results may not be readily applicable to public, special, and school library contexts. Originality/value: This is the first empirical study to develop a participatory library model. It provides librarians, library managers, researchers, library students, and the wider library community with a holistic picture of the contemporary library.
Abstract:
In this chapter we introduce and explore the notion of “intentionally enriched awareness”. Intentional enrichment refers to the process of actively engaging users in the awareness process by enabling them to express intentions. We initially look at the phenomenon of sharing intentional information in related collaborative systems. We then explore the concept of intentional enrichment by designing and evaluating the AnyBiff system, which allows users to freely create, share and use a variety of biff applications. Biffs are simple representations of pre-defined activities; users can select biffs to indicate that they are engaged in an activity. We summarise the results of a trial which allowed us to gain insights into the potential of the AnyBiff prototype and the underlying biff concept to implement intentionally enriched awareness. Our findings show that intentional disclosure mechanisms in the form of biffs were successfully used in a variety of contexts. Users actively engaged in the design of a large variety of biffs and explored many different uses of the concept. The study revealed a whole host of issues with regard to intentionally enriched awareness which give valuable insight into the conception and design of future applications in this area.
Abstract:
Efforts to reduce carbon emissions in the buildings sector have focused on encouraging green design, construction and building operation; however, the business case is not very compelling when considering energy cost savings alone. In recent years green building has been driven by a sense that it will improve the productivity of occupants, something with much greater economic returns than energy savings. Reducing energy demand in green commercial buildings in a way that encourages greater productivity is not yet well understood, as it involves a set of complex and interdependent factors. This project investigates these factors and focuses on the performance of, and interaction between: green design elements, internal environmental quality, occupant experience, tenant/leasing agreements, and building regulation and management. This paper suggests six areas of strategic research that are needed to understand how conditions can be created to support productivity in green buildings and deliver significant reductions in energy consumption.
Abstract:
As patterns of media use become more integrated with mobile technologies and multiple screens, a new mode of viewer engagement has emerged in the form of connected viewing, which allows for an array of new relationships between audiences and media texts in the digital space. This exciting new collection brings together twelve original essays that critically engage with the socially-networked, multi-platform, and cloud-based world of today, examining the connected viewing phenomenon across television, film, video games, and social media. The result is a wide-ranging analysis of shifting business models, policy matters, technological infrastructure, new forms of user engagement, and other key trends affecting screen media in the digital era. Connected Viewing contextualizes the dramatic transformations taking place across both media industries and national contexts, and offers students and scholars alike a diverse set of methods and perspectives for studying this critical moment in media culture.
Abstract:
Objective: Contemporary research demonstrates the feasibility of assessing the therapeutic performance of trainee-therapists through objective measures of client treatment outcome. Further, significant variation between individual therapists, based on their client treatment outcomes, has been demonstrated. This study sets out to determine whether a reliable composite measure of therapeutic efficiency, effectiveness and early dropout can be developed and used to objectively compare trainee-therapists against each other. Design and methods: Treatment outcomes of 611 clients receiving treatment from 58 trainee-therapists enrolled in a professional training programme were tracked with the OQ-45.2 over a 6-year period to assess therapeutic efficiency, therapeutic effectiveness and early client dropout. Results: Significant variation between trainee-therapists was observed for each index. A moderately strong correlation between therapeutic efficiency and effectiveness enabled the ranking of trainee-therapists on a composite measure of these indexes. A non-significant correlation was found between early client dropout and the measures of therapeutic effectiveness and efficiency. Conclusions: The findings stress the importance of using objective measures to track treatment outcomes. Despite all trainee-therapists being enrolled in the same training programme, significant variation between trainee-therapists' therapeutic efficiency and effectiveness was found to exist. Practitioner points: Developing benchmarking tools that enable trainee-therapists, supervisors and educational institutions to quickly assess therapeutic performance can become part of a holistic assessment of a trainee-therapist's clinical development. Despite an inherent optimistic belief that therapists do not cause harm, there appears to be a small but significant proportion of trainee-therapists who consistently evidence little therapeutic change. Considerable variability in trainee-therapists' therapeutic efficiency and effectiveness can exist within a single training programme. Early client dropout may not be associated with therapists' therapeutic effectiveness and efficiency.
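The composite-measure idea can be sketched with a toy calculation. Everything below is invented for illustration; the study's OQ-45.2 data and exact index definitions are not reproduced. Two hypothetical per-therapist indexes are standardised to z-scores and summed, and therapists are then ranked on the sum.

```python
from statistics import mean, stdev

def zscores(values):
    """Standardise a list of values to zero mean and unit (sample) variance."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# Hypothetical per-therapist indexes (higher = better on both).
efficiency    = [0.8, 1.4, 0.5, 1.1, 0.9]   # e.g. outcome change per session
effectiveness = [6.0, 9.5, 4.0, 8.0, 7.0]   # e.g. total pre-post change

# Composite = sum of the two standardised indexes.
composite = [ze + zf for ze, zf in zip(zscores(efficiency),
                                       zscores(effectiveness))]

# Rank therapist indices from best to worst on the composite.
ranking = sorted(range(len(composite)), key=lambda i: -composite[i])
```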
Abstract:
Schooling is one of the core experiences of most young people in the Western world. This study examines the ways that students inhabit subjectivities defined in their relationship to some normalised good student. The idea that schools exist to produce students who become good citizens is one of the basic tenets of the modernist educational philosophies that dominate the contemporary education world. The school has become a political site where policy, curriculum orientations, expectations and philosophies of education contest for the ‘right’ way to school and be schooled. For many people, schools and schooling only make sense if they resonate with past experiences. The good student is framed within these aspects of cultural understanding. However, this commonsense attitude is based on a hegemonic understanding of the good, rather than the good student as a contingent multiplicity produced by an infinite set of discourses and experiences. In this book, author Greg Thompson argues that this understanding of subjectivities and power is crucial if schools are to meet the needs of a rapidly changing and challenging world. As a high school teacher for many years, Thompson often wondered how students responded to complex articulations of how to be a good student. How a student can be considered good is itself an articulation of powerful discourses that compete within the school. Rather than assuming a moral or ethical citizen, this study turns that logic on its head to ask students in what ways they can be good within the school. Visions of the good student deployed in various ways in schools act to produce various ways of knowing the self as certain types of subjects. Developing the postmodern theories of Foucault and Deleuze, this study argues that schools act to teach students to know themselves in certain idealised ways through which they are located, and locate themselves, in hierarchical rationales of the good student.
Problematising the good student in high schools engages those institutional discourses with the philosophy, history and sociology of education. Asking students how they negotiate or perform their selves within schools challenges the narrow and limiting ways that the good is often understood. By pushing the ontological understandings of the self beyond the modernist philosophies that currently dominate schools and schooling, this study problematises the tendency to see students as fixed, measurable identities (beings) rather than dynamic, evolving performances (becomings). Who is the Good High School Student? is an important book for scholars conducting research on high school education, as well as student-teachers, teacher educators and practicing teachers alike.
Abstract:
Connectedness is a complex idea that seems to mean different things for each individual. For the purposes of this dissertation, connectedness can best be understood as the ways that an individual feels an affiliation with the community of the institution that he/she experiences. This dissertation seeks to uncover the discourses that various stakeholder groups have within the site of a single school concerning connectedness. One of the precepts that this dissertation holds is that connectedness to school has benefits for the individual as learner, the school as a community and potentially the wider community in years to come. This is a theoretical position in the lineage of such theorists as Plato, Rousseau, and Dewey who have argued that education is a transformative practice that could be a tool for solving some of the issues that contemporary societies face. This work uses the theories of Foucault to extend the analysis to argue that connectedness is not a monolithic constant, but rather a complex set of converging and diverging discourses that students must contend with.
Abstract:
This project developed three mathematical models for scheduling ambulances and ambulance crews and proceeded to solve each model for test scenarios based on real data. Results from these models can serve as decision aids for dispatching or relocating ambulances, and for strategic decisions on the ambulance crews needed each shift. This thesis used flexible flow shop scheduling techniques to formulate strategic, dynamic and real-time models. Metaheuristic solution techniques were applied to a case study with realistic data. These models are suitable for ambulance planners and dispatchers.
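As a toy illustration of the dispatching flavour of the problem (not the thesis's flexible flow shop formulations or metaheuristics), the sketch below greedily assigns each incoming call to whichever crew becomes free first, a simple list-scheduling baseline on parallel machines.

```python
import heapq

def greedy_dispatch(call_durations, n_crews):
    """Assign calls in arrival order to the earliest-free crew.
    Returns per-call crew assignments and the resulting makespan."""
    # Min-heap of (time the crew becomes free, crew id).
    crews = [(0.0, c) for c in range(n_crews)]
    heapq.heapify(crews)
    assignment = []
    for dur in call_durations:
        free_at, crew = heapq.heappop(crews)   # earliest-free crew
        assignment.append(crew)
        heapq.heappush(crews, (free_at + dur, crew))
    makespan = max(t for t, _ in crews)        # last crew to finish
    return assignment, makespan

calls = [30, 20, 40, 10, 25]                   # service times in minutes
assignment, makespan = greedy_dispatch(calls, n_crews=2)
```

Real dispatch decisions would also weigh travel times, coverage and call priorities, which is where the thesis's richer models come in.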
Abstract:
Past research on early internationalising firms often examined factors and motivations potentially influencing internationalisation activities separately. The purpose of this paper was to investigate a set of indicators and their interplay with each other. Firstly, the impact of (a) international potential, in the form of the depth and diversity of international experience and network contacts, was investigated. Secondly, it was examined to what extent (b) motivational factors and (c) firm stages affect the relationship between international potential and internationalisation activities. This paper used longitudinal data from the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE). Results suggest that the international potential of a new venture as a whole is a significant determinant of subsequent internationalisation activities. However, having diverse international experience from a variety of foreign countries appears to be more beneficial than long-lasting experience from only a limited number of foreign countries. Furthermore, analyses showed that the interplay of high growth ambitions and the depth of international experience positively affects internationalisation activities. Opportunity- or necessity-driven entrepreneurship, however, neither strengthens nor weakens the positive relationship between international potential and internationalisation activities. Similarly, no moderation by firm stages was found.
Abstract:
Introduction: For many years concern for public health has transcended the boundaries of the medical sciences and epidemiology. For the last 50 years or so, psychologists have been increasingly active in this field. Recently, psychologists have not only begun to see the need to take action to mould health-promoting behaviours in individuals, but have also pointed out the need to join in an effort to develop appropriate social, political, economic and institutional conditions which would help to improve the state of public health. Psychologists have postulated the need to distinguish a new subdiscipline of psychology called public health psychology which, together with other disciplines, would further the realization of this goal. In the following article, the historical and international context of health psychology and the changing nature of public health are put forward as having important implications for the establishment of a ‘public health psychology’. These implications are addressed in later sections of the article through a description of the conceptual and practical framework of public health psychology, in which theory, methods and practice are considered. Many aspects of this framework are relevant to the health social sciences more generally and form a basis for interdisciplinary work. The framework of public health psychology, together with the obstacles that need to be overcome, is critically examined within an overall approach that contends it is necessary to increase and improve the contribution of health psychology to public health.
Abstract:
This is a compilation of lecture notes and tutorial workshops that were prepared for the former QUT unit DLB310 People and Place. This unit introduced second-year students to fundamental ideas about environmental psychology and cultural landscape theory for landscape architects.
Abstract:
The climate in the Arctic is changing faster than anywhere else on earth. Poorly understood feedback processes relating to Arctic clouds and aerosol–cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from measurements in situ in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007–2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were undertaken in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time, extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper ocean physics. ASCOS provides a unique interdisciplinary data set for development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean and associated physical, chemical, and biological processes and interactions. 
For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material, polymer gels with an origin in the ocean, inside cloud droplets suggests the possibility of primary marine organically derived cloud condensation nuclei in Arctic stratocumulus clouds. Direct observations of surface fluxes of aerosols could, however, not explain the observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains an open question. Lack of cloud condensation nuclei (CCN) was at times a controlling factor in low-level cloud formation, and hence for the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from the late summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can be, and is being, used for validation of satellite retrievals, operational models, and reanalysis data sets.
Abstract:
Since the celebrated linear minimum mean square (MMS) Kalman filter in an integrated GPS/INS system cannot guarantee robust performance, an H∞ filter with respect to polytopic uncertainty is designed. The purpose of this paper is to illustrate this application and contrast it with the traditional Kalman filter. A game-theoretic H∞ filter is first reviewed; next, a linear matrix inequality (LMI) approach is used to design the robust H∞ filter. For the specific INS/GPS model, the unstable-model case is considered. We explain Kalman filter divergence under an uncertain dynamic system and, at the same time, investigate the relationship between the H∞ filter and the Kalman filter. A loosely coupled INS/GPS simulation system is used to verify the application. Results show that the robust H∞ filter performs better when the system suffers uncertainty, and that it is more robust than the conventional Kalman filter.
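The divergence mechanism under model uncertainty can be illustrated with a scalar toy example (this is not the paper's LMI-based H∞ design): a Kalman-style filter built around a nominal dynamics coefficient accumulates much larger errors when the true system is slightly unstable and the nominal model is wrong, the situation polytopic-uncertainty designs are meant to handle.

```python
import random

def run_filter(a_filter, a_true, n=200, q=0.01, r=1.0, seed=1):
    """Run a scalar Kalman filter whose model uses a_filter against a true
    system with coefficient a_true; return the mean squared estimation error."""
    rng = random.Random(seed)
    x, xhat, p = 0.0, 0.0, 1.0
    sq_err = 0.0
    for _ in range(n):
        x = a_true * x + rng.gauss(0, q ** 0.5)            # true dynamics
        z = x + rng.gauss(0, r ** 0.5)                     # noisy measurement
        xhat, p = a_filter * xhat, a_filter**2 * p + q     # predict (nominal model)
        k = p / (p + r)
        xhat, p = xhat + k * (z - xhat), (1 - k) * p       # measurement update
        sq_err += (x - xhat) ** 2
    return sq_err / n

matched    = run_filter(a_filter=1.01, a_true=1.01)  # model agrees with truth
mismatched = run_filter(a_filter=0.90, a_true=1.01)  # model is wrong; errors grow
```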
Abstract:
In the field of face recognition, sparse representation (SR) has received considerable attention during the past few years, with a focus on holistic descriptors in closed-set identification applications. The underlying assumption in such SR-based methods is that each class in the gallery has sufficient samples and the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the face verification scenario, where the task is to determine if two faces (where one or both have not been seen before) belong to the same person. In this study, the authors propose an alternative approach to SR-based face verification, where SR encoding is performed on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which then form an overall face descriptor. Owing to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, they evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN) and an implicit probabilistic technique based on Gaussian mixture models. Thorough experiments on AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, on both the traditional closed-set identification task and the more applicable face verification task. The experiments also show that l1-minimisation-based encoding has a considerably higher computational cost when compared with SANN-based and probabilistic encoding, but leads to higher recognition rates.
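The local-patch pipeline can be sketched in a few lines. The encoder below is only a placeholder (a correlate-and-soft-threshold stand-in, not an actual l1 solver, SANN or GMM), and all sizes are invented; the point is the averaging pooling step, which makes each region descriptor invariant to patch order within the region.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_patch(patch, dictionary, thresh=0.5):
    """Placeholder sparse encoder: correlate the patch with dictionary atoms
    and soft-threshold the coefficients (standing in for l1-minimisation)."""
    c = dictionary @ patch
    return np.sign(c) * np.maximum(np.abs(c) - thresh, 0.0)

def face_descriptor(patches, region_of_patch, n_regions, dictionary):
    """Average the sparse codes of the patches in each region, then
    concatenate the region descriptors into one face descriptor."""
    dim = dictionary.shape[0]
    sums = np.zeros((n_regions, dim))
    counts = np.zeros(n_regions)
    for patch, r in zip(patches, region_of_patch):
        sums[r] += encode_patch(patch, dictionary)
        counts[r] += 1
    regions = sums / counts[:, None]   # average pooling per region
    return regions.ravel()

# Toy data: 16 patches of dimension 8, 4 regions, a 32-atom random dictionary.
patches = [rng.normal(size=8) for _ in range(16)]
region_of_patch = [i // 4 for i in range(16)]   # 4 patches per region
dictionary = rng.normal(size=(32, 8))

d = face_descriptor(patches, region_of_patch, 4, dictionary)
```

Because pooling is an average, shuffling the patches inside a region leaves the descriptor unchanged, which is the source of the robustness to misalignment noted above.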