979 results for Field concept
Abstract:
The experimental literature and studies using survey data have established that people care a great deal about their relative economic position and not solely, as standard economic theory assumes, about their absolute economic position. Individuals are concerned about social comparisons. However, behavioral evidence in the field is rare. This paper provides an empirical analysis, testing the model of inequality aversion using two unique panel data sets for basketball and soccer players. We find support for the notion that the concept of inequality aversion helps explain how the relative income situation affects performance in a real competitive environment with real tasks and real incentives.
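The abstract does not state the functional form being tested; work on inequality aversion typically builds on the Fehr–Schmidt (1999) utility function, reproduced here as a point of reference rather than as the paper's exact specification:

\[
U_i(x) = x_i \;-\; \frac{\alpha_i}{n-1}\sum_{j \neq i}\max(x_j - x_i,\, 0) \;-\; \frac{\beta_i}{n-1}\sum_{j \neq i}\max(x_i - x_j,\, 0),
\]

where \(\alpha_i\) measures player \(i\)'s aversion to disadvantageous inequality (others earning more) and \(\beta_i\) the aversion to advantageous inequality (earning more than others).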
Abstract:
This thesis is a work of creative practice-led research comprising two components. The first component is a speculative thriller novel, entitled Diamond Eyes. (Contracted for publication in 2009 by Harper Collins: Voyager as the first in a trilogy, under the name AA Bell.) The second component is an exegesis exploring the notion of re-visioning a novel. Re-visioning, not to be confused with revision, refers to advance editing strategies required when the original vision of a novel changes during development.
Abstract:
Current estimates of soil C storage potential are based on models or factors that assume linearity between C input levels and C stocks at steady state, implying that soil organic carbon (SOC) stocks could increase without limit as C input levels increase. However, some soils show little or no increase in steady-state SOC stock with increasing C input levels, suggesting that SOC can become saturated with respect to C input. We used long-term field experiment data to assess alternative hypotheses of soil carbon storage with three simple models: a linear model (no saturation), a one-pool whole-soil C saturation model, and a two-pool mixed model with C saturation of a single C pool, but not the whole soil. The one-pool C saturation model best fit the combined data from 14 sites, four individual sites were best fit with the linear model, and no sites were best fit by the mixed model. These results indicate that existing agricultural field experiments generally have too small a range in C input levels to show saturation behavior, and verify the accepted linear relationship between soil C and C input used to model soil organic matter (SOM) dynamics. However, all sites combined and the site with the widest range in C input levels were best fit with the C saturation model. Nevertheless, the same site produced distinct effective stabilization capacity curves rather than an absolute C saturation level. We conclude that saturation of soil C does occur, and therefore the greatest efficiency in soil C sequestration will be in soils further from C saturation.
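As a point of reference for the competing hypotheses, the steady-state models can be written in forms such as the following (illustrative forms only; the abstract does not give the study's exact parameterisations): a linear model in which the steady-state stock grows without bound with input, and a one-pool saturation model in which it approaches an upper limit:

\[
C_{ss}^{\text{linear}} = k\, I, \qquad C_{ss}^{\text{saturation}} = \frac{C_{\max}\, I}{K_s + I},
\]

where \(I\) is the C input level, \(k\) a storage efficiency coefficient, \(C_{\max}\) the maximum stabilised soil C stock, and \(K_s\) the input level at which half of \(C_{\max}\) is reached.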
Abstract:
The majority of the world’s population now lives in cities (United Nations, 2008), resulting in urban densification that requires people to live in closer proximity and to share urban infrastructure such as streets, public transport, and parks. However, “physical closeness does not mean social closeness” (Wellman, 2001, p. 234). Whereas it is common practice to greet and chat with people you cross paths with in smaller villages, urban life is largely anonymous and does not automatically come with a sense of community. Wellman (2001, p. 228) defines community “as networks of interpersonal ties that provide sociability, support, information, a sense of belonging and social identity.” While on the move or during leisure time, urban dwellers use their interactive information communication technology (ICT) devices to connect to their spatially distributed community while in an anonymous space. Putnam (1995) argues that available technology privatises and individualises the leisure time of urban dwellers. Furthermore, ICT is sometimes used to build a “cocoon” while in public to avoid direct contact with collocated people (Mainwaring et al., 2005; Bassoli et al., 2007; Crawford, 2008). Instead of using ICT devices to seclude oneself from the surrounding urban environment and the collocated people within it, such devices could also be utilised to engage urban dwellers more with the urban environment and with one another. Urban sociologists found that “what attracts people most, it would appear, is other people” (Whyte, 1980, p. 19) and “people and human activity are the greatest object of attention and interest” (Gehl, 1987, p. 31). On the other hand, the sociologist Erving Goffman describes the concept of civil inattention: acknowledging strangers’ presence while in public but not interacting with them (Goffman, 1966). With this in mind, there appears to be a contradiction between how and why people use ICT in urban public places and how they use those places and behave towards other collocated people. At the same time, there is an opportunity to employ ICT to create and influence the experiences of people collocated in public urban places. The widespread use of location-aware mobile devices equipped with Internet access is creating networked localities, a digital layer of geo-coded information on top of the physical world (Gordon & de Souza e Silva, 2011). Foursquare.com is an example of a location-based social network (LBSN) that enables urban dwellers to virtually check in to places at which they are physically present in an urban space. Users compete over ‘mayorships’ of places with Foursquare friends as well as strangers and can share recommendations about the space. The research field of Urban Informatics is interested in these kinds of digital urban multimedia augmentations and in how such augmentations, mediated through technology, can create or influence the user experience (UX) of public urban places. “Urban informatics is the study, design, and practice of urban experiences across different urban contexts that are created by new opportunities of real-time, ubiquitous technology and the augmentation that mediates the physical and digital layers of people networks and urban infrastructures” (Foth et al., 2011, p. 4). One possibility to augment the urban space is to enable citizens to digitally interact with spaces and with urban dwellers collocated in the past, present, and future.
“Adding digital layer to the existing physical and social layers could facilitate new forms of interaction that reshape urban life” (Kjeldskov & Paay, 2006, p. 60). This methodological chapter investigates how the design of UX through such digital place-based mobile multimedia augmentations can be guided and evaluated. First, we describe three different applications that aim to create and influence the urban UX through mobile mediated interactions. Based on a review of the literature, we describe how our integrated framework for designing and evaluating urban informatics experiences has been constructed. We conclude the chapter with a reflective discussion of the proposed framework.
Abstract:
For more than a decade, research in the field of context-aware computing has aimed to find ways to exploit situational information that can be detected by mobile computing and sensor technologies. The goal is to provide people with new and improved applications, enhanced functionality and a better user experience (Dey, 2001). Early applications focused on representing or computing on physical parameters, such as showing your location and the location of people or things around you. Such applications might show where the next bus is, which of your friends are in the vicinity, and so on. With the advent of social networking software and microblogging sites such as Facebook and Twitter, recommender systems, and so on, context-aware computing is moving towards mining the social web in order to provide better representations and understanding of context, including social context. In this paper we begin by recapping different theoretical framings of context. We then discuss the problem of context-aware computing from a design perspective.
Abstract:
New knowledge has raised concerns about the cost-ineffective design methods and the true performance of railroad prestressed concrete ties. Because of these previous knowledge deficiencies, railway civil and track engineers have relied on conservative design methods for structural components in railway tracks, methods based on allowable stresses and material strength reductions. In particular, the railway sleeper (or railroad tie) is an important component of railway tracks and is commonly made of prestressed concrete. The existing code for designing such components makes use of the permissible stress design concept, whereby the fiber stresses over cross sections at the initial and final stages are limited to empirical values. Based on a number of experiments and field data, it is believed that concrete ties complying with the permissible stress concept possess substantial untapped fracture toughness. Collaborative research run by the Australian Cooperative Research Centre for Railway Engineering and Technologies (Rail CRC) was initiated to ascertain the reserve capacity of Australian railway prestressed concrete ties that were designed using the existing design code. The findings have led to the development of a new limit-states design concept. This paper highlights the conventional and the new limit-states design philosophies and their implications for both the railway community and the public. © 2011 American Society of Civil Engineers.
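For orientation, the two design philosophies can be contrasted in generic forms (illustrative only; these are not the specific provisions of the existing Australian code or of the new limit-states concept developed by the Rail CRC). Permissible stress design limits working-load stresses to a fraction of the material strength, whereas limit-states design compares factored load effects with a factored resistance:

\[
\sigma_{\text{working}} \le \sigma_{\text{perm}} = \frac{f_k}{\text{FS}}, \qquad \sum_i \gamma_i\, Q_i \le \phi\, R_n,
\]

where \(f_k\) is a characteristic material strength, FS a factor of safety, \(\gamma_i\) load factors on load effects \(Q_i\), \(\phi\) a capacity reduction factor, and \(R_n\) the nominal member resistance.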
Abstract:
Universities in Australia and elsewhere have changed considerably in recent years. Inevitably, this has meant that the work of academics has also changed. Academics’ work is of importance because they are key players in universities, and universities matter to the nation economically and intellectually in advancing knowledge and its practical application. Through the changes and challenges that have characterised academia in recent years, there is an assumption that academics’ work is representative of a profession. This research study investigates how academics construct their own perspectives regarding the academic "profession". The study is theoretically informed by Freidson’s theory, which conceptualises professions as occupations that are in control of their own work, rather than having that work controlled by the market or by their employing institutions. Two research questions guide this study. The first question investigates how academics might construct their work in ideal terms, and the second investigates the extent to which such constructions might constitute a "profession". A qualitative case study was conducted within two Australian universities. In all, twenty academics from ten disciplines took part in the study, which consisted of a focus group and fifteen individual interviews. The study was conducted in three phases, during which a conceptual framework of academics’ work was developed across three versions. This framework acted both as a prompt to discussion and as a potential expression of academics’ work. The first version of the framework was developed from the literature during the first phase of the study. This early framework was used during the second phase of the study, when five academics took part in a focus group. After the focus group, the second version of the framework was developed and used with fifteen academics in individual interviews during phase three of the study. The third version of the framework was the outcome of a synthesis of the themes that were identified in the data. The discussion data from the focus group and the individual interviews were analysed through a content analysis approach that identified four major themes. The first theme was that academics reported that their work would ideally be located within universities committed to using their expert knowledge to serve the world. The second theme was that academics reported that they wanted sufficient thinking time and reasonable workloads to undertake the intellectual work that they regard as their core responsibility, particularly in relation to undertaking research. They argued against heavy routine administrative workloads and sought a continuation of current flexible working arrangements. The third theme was that teaching qualifications should not be mandated, but that there should be a continuation of the present practice of universities offering academics the opportunity to undertake formal teaching qualifications if they wish to. Finally, academics reported that they wanted values that have traditionally mattered to academia to continue to be respected and practised: autonomy, collegiality and collaborative relationships, altruism and service, and intellectual integrity. These themes are sympathetic to Freidson’s theory of professions in all but one matter: the non-mandatory nature of formal qualifications, which he regards as absolutely essential for the performance of the complex intellectual work that characterises occupations that are professions.
The study places the issue of academic professionalism on the policy agenda for universities wishing to identify academics’ work as a profession. The study contributes a theory-based and data-informed conceptual framework for academics’ work that can be considered in negotiating the nature and extent of their work. The framework provides a means of analysing what "academic professionalism" might mean; it adds specificity to such discussions by exploring a particular definition of profession, namely Freidson’s theory of professions as occupations that are in control of their own work. The study contributes to the development of theories around higher education concepts of academic professionalism and, in so doing, links that theoretical contribution to the wider professions field.
Abstract:
Purpose – The purpose of this paper is to provide a summary description of the doctoral thesis investigating the field of project management (PM) deployment. Researchers will be informed of the current contributions within this topic and of possible further investigations and research. Decision makers and practitioners will become aware of a set of tools that address PM deployment from new perspectives. Design/methodology/approach – The research undertaken for the thesis is based on quantitative methods using time series statistics (time distance analysis) and comparative and correlation analysis, aimed at better defining and understanding PM deployment within and between countries or groups. Findings – The results suggest a project management deployment index (PMDI) to objectively measure PM deployment based on the concept of certification. A framework is proposed to empirically benchmark PM deployment between countries by integrating the PMDI time series with Sicherl's two-dimensional comparative analysis. The correlation analysis within Hofstede's cultural framework shows the impact of national culture dimensions on PM deployment. The forecasting model shows a general continual growth trend in PM deployment, with a continual increase in the time distance between countries. Research limitations/implications – PM researchers are offered an empirical quantification on which they can construct further investigations and understanding of this phenomenon. The number of possible units that can be studied offers wide possibilities to replicate the thesis work. New research can be undertaken to investigate further the contribution of other social or economic indicators, or to refine and enrich the definition of the PMDI indicator. Practical implications – These results have important implications for PM deployment approaches. The PMDI measurements and time series comparisons considerably facilitate measurement and benchmarking between units (e.g. countries) and against targets, while the readiness of the studied unit (in terms of development and cultural levels) affects PM deployment within that country. Originality/value – This paper provides a summary of cutting-edge research work in the studied field of PM deployment and a link to the published works that researchers can use to help them understand the thesis research as well as how it can be extended.
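The abstract does not spell out how the PMDI or the Sicherl-style time-distance comparison is computed. As a rough sketch under assumed definitions (certifications per 100,000 inhabitants for the index, and an interpolated year-difference at a common indicator level for the time distance), one might proceed as follows:

```python
# Illustrative sketch only: the thesis's exact PMDI definition and time-distance
# procedure are not given in the abstract, so the per-capita index and the
# interpolation-based time-distance estimate below are assumptions.
from bisect import bisect_left

def pmdi(certified_professionals: int, population: int) -> float:
    """Hypothetical deployment index: certifications per 100,000 inhabitants."""
    return 100_000 * certified_professionals / population

def time_distance(years_a, values_a, years_b, values_b, level):
    """S-time-distance style measure: how many years later series B reaches
    the given level compared with series A (linear interpolation, assuming
    both series are monotonically increasing)."""
    def year_at(years, values, lvl):
        i = bisect_left(values, lvl)
        if i == 0:
            return years[0]
        if i == len(values):
            return None  # level never reached within the observed period
        y0, y1, v0, v1 = years[i - 1], years[i], values[i - 1], values[i]
        return y0 + (lvl - v0) * (y1 - y0) / (v1 - v0)

    ta = year_at(years_a, values_a, level)
    tb = year_at(years_b, values_b, level)
    return None if ta is None or tb is None else tb - ta

# Example: roughly how many years later does country B reach a PMDI of 50?
years = [2000, 2005, 2010]
print(time_distance(years, [20, 60, 90], years, [5, 25, 70], 50))  # ~4 years
```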
Abstract:
This study explores the impact of field experience in Australian primary classrooms on the developing professional identities of Malaysian pre-service teachers. These 24 Malaysian students are undertaking their Bachelor of Education in Teaching English as a Foreign Language (BEd TESL) at an Australian university as part of a transnational twinning program. The globalisation of education has seen an increase in such transnational school experiences for pre-service teachers, with the aim of extending professional experience and intercultural competence by engaging in communities of practice beyond the local (Tsui 2005, Luke 2004). Despite overseas governments, such as Malaysia's, having sponsored multimillion-dollar twinning programs for their pre-service teachers, there is a lack of research regarding the outcomes of transnational professional practice within such programs. This study adopts a qualitative approach focusing on participants’ narratives as revealed in their reflective writing and through semi-structured interviews. Adopting a Bakhtinian framework, this research uses the concept of ‘voice’ to explore how pre-service teachers negotiate their identities as EFL teachers in response to their lived professional experiences (Bakhtin 1981, 1986). Encountering different cultural and educational practices in their transnational field experiences can lead pre-service teachers to question taken-for-granted practices that they have grown up with. This has been described as a process of making the familiar strange, and it can lead to a shift in professional understandings. This study investigates how such questioning occurs and how the transnational field experience is perceived by the participants as contributing to their developing professional identities.
Abstract:
The concept of market-driven rather than product-driven quality management has been given prominence through the report of a recent inquiry into the performance of the Hong Kong construction industry. The report, submitted to the Government of Hong Kong in 2001, establishes a new vision of ‘an integrated industry that is capable of continuous improvement towards excellence in the market-driven environment’. Given the current economic downturn, major contractors are facing many challenges in realizing this new quality-oriented vision. This paper addresses the critical and timely issue of applying quality management to the project delivery process in Hong Kong. The paper attempts to capture and critically examine management perceptions of quality management aspects as applied to a local large-scale road construction project. Based on the analysis of questionnaire feedback and face-to-face interviews, the paper reveals key attributes of a successful application of quality management approaches, and identifies a mechanism for facilitating such implementation.
Abstract:
Purpose: This work introduces the concept of very small field size. Output factor (OPF) measurements at these field sizes require extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as each OPF measurement. Two quantifiable scientific definitions of the threshold of very small field size are presented. Methods: A practical definition was established by quantifying the effect that a 1 mm error in field size or detector position had on OPFs, and setting the acceptable uncertainty on OPF at 1%. Alternatively, for a theoretical definition of very small field size, the OPFs were separated into additional factors to investigate the specific effects of lateral electronic disequilibrium, photon scatter in the phantom, and source occlusion. The dominant effect was established and formed the basis of a theoretical definition of very small fields. Each factor was obtained using Monte Carlo simulations of a Varian iX linear accelerator for various square field sizes of side length from 4 mm to 100 mm, using a nominal photon energy of 6 MV. Results: According to the practical definition established in this project, field sizes < 15 mm were considered very small for 6 MV beams for maximal field size uncertainties of 1 mm. If the acceptable uncertainty in the OPF was increased from 1.0% to 2.0%, or field size uncertainties were 0.5 mm, field sizes < 12 mm were considered very small. Lateral electronic disequilibrium in the phantom was the dominant cause of change in OPF at very small field sizes. Thus the theoretical definition of very small field size coincided with the field size at which lateral electronic disequilibrium clearly caused a greater change in OPF than any other effect. This was found to occur at field sizes < 12 mm. Source occlusion also caused a large change in OPF for field sizes < 8 mm. Based on the results of this study, field sizes < 12 mm were considered theoretically very small for 6 MV beams. Conclusions: Extremely careful experimental methodology, including measurement of the dosimetric field size at the same time as the output factor measurement for each field size setting, as well as very precise detector alignment, is required at field sizes at least < 12 mm and, more conservatively, < 15 mm for 6 MV beams. These recommendations should be applied in addition to all the usual considerations for small field dosimetry, including careful detector selection.
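For context, the output factor is conventionally defined as a dose ratio between the field of interest and a reference field under otherwise identical conditions (standard definition; the abstract does not restate it):

\[
\mathrm{OPF}(A) = \frac{D(A)}{D(A_{\text{ref}})},
\]

where \(D(A)\) is the dose at the reference depth for field size \(A\) and \(D(A_{\text{ref}})\) is the dose for the reference field (typically 10 cm × 10 cm) delivered with the same number of monitor units. The study's theoretical definition separates this ratio into factors isolating lateral electronic disequilibrium, phantom scatter, and source occlusion.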
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant techniques for developing emulators have been based on priors in the form of Gaussian stochastic processes (GASP) that were conditioned with a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept for developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, because it incorporates our knowledge of the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by the application to a simple hydrological model.
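Schematically (notation assumed here for illustration; the abstract does not give the exact formulation), the prior combines a linear state-space approximation of the simulator's time evolution with Gaussian process innovation terms:

\[
y_{t+1}(\theta) = A\, y_t(\theta) + b + \eta_t(\theta), \qquad \eta_t(\cdot) \sim \mathcal{GP}\bigl(0,\, k(\theta, \theta')\bigr),
\]

where \(y_t(\theta)\) is the emulated model output at time step \(t\) for parameters and/or inputs \(\theta\), \(A\) and \(b\) define the simplified linear model, and the innovation terms \(\eta_t\) correct its error at each output step. Conditioning on the design runs then reduces to Kalman smoothing, which is what keeps the emulator efficient for long, closely spaced output time series.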
Abstract:
To obtain accurate Monte Carlo simulations of small radiation fields, it is important to model the initial source parameters (electron energy and spot size) accurately. However, recent studies have shown that small field dosimetry correction factors are insensitive to these parameters. The aim of this work is to extend this concept and test whether these parameters affect dose perturbations in general, which is important for detector design and for calculating perturbation correction factors. The EGSnrc C++ user code cavity was used for all simulations. Varying amounts of air between 0 and 2 mm were deliberately introduced upstream of a diode, and the dose perturbation caused by the air was quantified. These simulations were then repeated using a range of initial electron energies (5.5 to 7.0 MeV) and electron spot sizes (0.7 to 2.2 FWHM). The resultant dose perturbations were large: for example, 2 mm of air caused a dose reduction of up to 31% when simulated with a 6 mm field size. However, these values did not vary by more than 2% when simulated across the full range of source parameters tested. If a detector is modified by the introduction of air, one can therefore be confident that the response of the detector will be the same across all similar linear accelerators, and Monte Carlo modelling of each individual machine is not required.
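The perturbation can be expressed as a simple dose ratio (the abstract does not define it explicitly, so the form below is assumed for illustration):

\[
P_{\text{air}} = \frac{D_{\text{det, with air gap}}}{D_{\text{det, no air gap}}},
\]

so the reported 31% dose reduction for the 2 mm air gap at a 6 mm field size corresponds to \(P_{\text{air}} \approx 0.69\), and the reported insensitivity means this ratio shifts by no more than about 2% across the tested source parameters.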
Abstract:
Engineers must have a deep and accurate conceptual understanding of their field, and concept inventories (CIs) are one method of assessing conceptual understanding and providing formative feedback. Current CI tests use multiple choice questions (MCQs) to identify misconceptions and have undergone reliability and validity testing to assess conceptual understanding. However, they do not readily provide diagnostic information about students’ reasoning and therefore do not effectively point to specific actions that can be taken to improve student learning. We piloted the textual component of our diagnostic CI on electrical engineering students using items from the signals and systems CI. We then analysed the textual responses using automated lexical analysis software to test the effectiveness of these types of software, and interviewed the students regarding their experience with the textual component. Results from the automated text analysis revealed that students held both incorrect and correct ideas in certain conceptual areas and provided indications of student misconceptions. User feedback also revealed that the inclusion of the textual component helps students assess and reflect on their own understanding.
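The abstract does not name the lexical analysis software or its method; as a hypothetical illustration, a very simple keyword-based pass over free-text responses could flag correct ideas and suspected misconceptions along the following lines (terms and labels below are invented for the example):

```python
# Hypothetical illustration only: a minimal keyword-based lexical analysis that
# flags correct signals-and-systems terms and suspected misconceptions in a
# student's free-text response. Real lexical analysis tools are more sophisticated.
CORRECT_TERMS = {"convolution", "impulse response", "linearity", "time invariance"}
MISCONCEPTIONS = {
    "multiply the signals": "confuses convolution with pointwise multiplication",
    "depends only on future inputs": "misreads causality",
}

def analyse_response(text: str):
    """Return the correct terms and suspected misconceptions found in a response."""
    lowered = text.lower()
    found_correct = {term for term in CORRECT_TERMS if term in lowered}
    found_misconceptions = {label for phrase, label in MISCONCEPTIONS.items()
                            if phrase in lowered}
    return found_correct, found_misconceptions

# Example: a response mixing a correct term with a misconception.
print(analyse_response("You just multiply the signals with the impulse response."))
```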
Abstract:
This thesis investigated in detail the physics of the small X-ray fields used in radiotherapy treatments. As a result of this work, the ability to accurately measure dose from these very small X-ray fields has been improved in several ways. These include scientifically quantifying when highly accurate measurements are required, by introducing the concept of a very small field, and the invention of a new detector that responds in the same way in very small fields as in standard fields.