634 results for Multiple testing
Destination brand equity for Australia: testing a model of CBBE in short haul and long haul markets
Abstract:
The study of destination brand performance measurement has only emerged in earnest as a field in the tourism literature since 2007. The concept of consumer-based brand equity (CBBE) is gaining favour among services marketing researchers as an alternative to the traditional ‘net-present-value of future earnings’ method of measuring brand equity. The perceptions-based CBBE model also appears suitable for examining destination brand performance, where a financial brand equity valuation on a destination marketing organisation’s (DMO) balance sheet is largely irrelevant. This is the first study to test and compare the model in both short and long haul markets. The paper reports the results of tests of a CBBE model for Australia in a traditional short haul market (New Zealand) and an emerging long haul market (Chile). The data from both samples indicated that destination brand salience, brand image, and brand value are positively related to purchase intent for Australia in these two disparate markets.
Abstract:
This paper looks at the accuracy of using the built-in camera of smart phones and free software as an economical way to quantify and analyse light exposure by producing luminance maps from High Dynamic Range (HDR) images. HDR images were captured with an Apple iPhone 4S to record a wide variation of luminance within an indoor and an outdoor scene. The HDR images were then processed using Photosphere software (Ward, 2010) to produce luminance maps, in which individual pixel values were compared with calibrated luminance meter readings. This comparison showed an average luminance error of ~8% between the HDR image pixel values and the luminance meter readings when the range of luminances in the image is limited to approximately 1,500 cd/m2.
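As a rough illustration of the comparison described above, the sketch below computes an average relative error between paired HDR-derived luminance values and meter readings. The function name and the sample values are hypothetical and are not taken from the study; this is a minimal sketch of the error metric, not the authors' analysis code.

```python
import numpy as np

def mean_relative_luminance_error(hdr_luminance, meter_luminance):
    """Average relative error (%) between luminance values extracted from
    an HDR-derived luminance map and calibrated luminance meter readings
    taken at the same measurement points (both in cd/m^2)."""
    hdr = np.asarray(hdr_luminance, dtype=float)
    meter = np.asarray(meter_luminance, dtype=float)
    return float(np.mean(np.abs(hdr - meter) / meter) * 100.0)

# Hypothetical paired readings (cd/m^2), for illustration only
hdr_values = [120.0, 480.0, 950.0, 1400.0]
meter_values = [112.0, 455.0, 1010.0, 1330.0]
print(f"Mean relative error: {mean_relative_luminance_error(hdr_values, meter_values):.1f}%")
```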
Abstract:
The Theory of the Growth of the Firm by Edith Penrose, first published in 1959, is a seminal contribution to the field of management. Penrose's intention was to create a theory of firm growth which was logically consistent and empirically tractable (Buckley and Casson, 2007). Much attention, however, has been focused on her unintended contribution to the resource-based view (henceforth RBV) (e.g. Kor and Mahoney, 2004; Lockett and Thompson, 2004) rather than her firm growth theory. We feel that this is unfortunate because, despite a rapidly growing body of empirical work, conceptual advancement in growth studies has been limited (Davidsson and Wiklund, 2000; Davidsson et al., 2006; Delmar, 1997; Storey, 1994). The growth literature frequently references Penrose's work, but little explicit testing of her ideas has been undertaken. This is surprising given that Penrose's work remains the most comprehensive theory of growth to date. One explanation is that she did not formally present her arguments, favouring verbal exposition over formalized models (Lockett, 2005; Lockett and Thompson, 2004). However, the central propositions and conclusions of her theory can be operationalized and empirically tested.
Abstract:
Background: Intra-amniotic infection accounts for 30% of all preterm births (PTB), with the human Ureaplasma species being the microorganism most frequently identified in the placentas of women who deliver preterm. The highest prevalence of PTB occurs late preterm (32-36 weeks), but no studies have investigated the role of infectious aetiologies associated with late preterm birth. Method: Placentas from women with late PTB were dissected aseptically and samples of chorioamnion tissue and membrane swabs were collected. These were tested for Ureaplasma spp. and aerobic/anaerobic bacteria by culture and real-time PCR. Western blot was used to assess MBA variation in ureaplasma clinical isolates. The presence of microorganisms was correlated with histological chorioamnionitis. Results: Ureaplasma spp. were isolated from 33/466 (7%) of placentas by culture or PCR. The presence of ureaplasmas, but not of other microorganisms, was associated with histological chorioamnionitis (21/33 ureaplasma-positive vs. 8/42 other bacteria; p = 0.001). Ureaplasma clinical isolates demonstrating no MBA variation were associated with histological chorioamnionitis. By contrast, ureaplasmas displaying MBA variation were isolated from placentas with no significant histological chorioamnionitis (p = 0.001). Conclusion: The presence of Ureaplasma spp. within placentas delivered late preterm (7%) was associated with histological chorioamnionitis (p = 0.001). Decreased inflammation within the chorioamnion was observed when the clinical ureaplasma isolates demonstrated variation of their surface-exposed lipoproteins (MBA). This variation may be a mechanism by which ureaplasmas modulate and evade the host immune response. Thus, although ureaplasmas are present intra-amniotically, they are not suspected because of the normal macroscopic appearance of the placentas and the amniotic fluid.
Abstract:
This thesis examines the social practice of homework. It explores how homework is shaped by the discourses, policies and guidelines in circulation in a society at any given time, with particular reference to one school district in the province of Newfoundland and Labrador, Canada. This study investigates how contemporary homework reconstitutes the home as a pedagogical site where the power of the institution of schooling circulates regularly from school to home. It examines how the educational system shapes the organization of family life and how family experiences with homework may be different in different sites depending on the accessibility of various forms of cultural capital. This study employs a qualitative approach, incorporating multiple case studies, and is complemented by insights from institutional ethnography and critical discourse analysis. It draws on the theoretical concepts of Foucault, including power and power relations, governmentality and surveillance, as well as Bourdieu’s concepts of economic, social and cultural capital for analysis. It employs concepts from Bourdieu’s work as they have been expanded on by researchers including Reay (1998), Lareau (2000), and Griffith and Smith (2005). The studies of these researchers allowed for an examination of homework as it related to families and mothers’ work. Smith’s (1987; 1999) concepts of ruling relations, mothers’ unpaid labour, and the engine of inequality were also employed in the analysis. Family interviews with ten volunteer families, teacher focus group sessions with 15 teachers from six schools, homework artefacts, school newsletters, homework brochures, and publicly available assessment and evaluation policy documents from one school district were analyzed. From this analysis key themes emerged and the findings are documented throughout five data analysis chapters. This study shows a change in education in response to a system shaped by standards, accountability and testing. It documents an increased transference of educational responsibility from one educational stakeholder to another. This transference of responsibility shifts downward until it eventually reaches the family in the form of homework and educational activities. Texts in the form of brochures and newsletters, sent home from school, make available to parents specific subject positions that act as instruments of normalization. These subject positions promote a particular ‘ideal’ family that has access to certain types of cultural capital needed to meet the school’s expectations. However, the study shows that these resources are not equally available to all, and some families struggle to obtain what is necessary to complete educational activities in the home. The increase in transference of educational work from the school to the home results in greater work for parents, particularly mothers. Consideration is also given to mothers’ role in homework and how, in turn, classroom instructional practices are sometimes dependent on the work completed at home, with differential effects for children. This study confirms previous findings that it is mothers who assume the greatest role in the educational trajectory of their children. An important finding in this research is that it is not only middle-class mothers who dedicate extensive time working hard to ensure their children’s educational success; working-class mothers also make substantial contributions of time and resources to their children’s education.
The assignments and educational activities distributed as homework require parents’ knowledge of technical school pedagogy to help their children. Much of the homework being sent home from schools is in the area of literacy, particularly reading, but requires parents to do more than read with children. A key finding is that the practices of parents are changing and being reconfigured by the expectations of schools in regard to reading. Parents are now being required to monitor and supervise children’s reading, as well as help children complete reading logs, written reading responses, and follow-up questions. The reality of family life as discussed by the participants in this study does not match the ‘ideal’ as portrayed in the educational documents. Homework sessions often create frustrations and tensions between parents and children. Some of the greatest struggles for families were created by mathematical homework, homework for those enrolled in the French Immersion program, and the work required to complete Literature, Heritage and Science Fair projects. Even when institutionalized and objectified capital was readily available, many families still encountered struggles when trying to carry out the assigned educational tasks. This thesis argues that homework and education-related activities play out differently in different homes. Consideration of this significance may assist educators to better understand and appreciate the vast differences between families and the ways in which each family can contribute to their children’s educational trajectory.
Abstract:
As the world’s population grows, so does the demand for agricultural products. However, natural nitrogen (N) fixation and phosphorus (P) availability cannot sustain the rising agricultural production; thus, the application of N and P fertilisers as additional nutrient sources is common. These anthropogenic activities can contribute high amounts of organic and inorganic nutrients to both surface waters and groundwaters, resulting in degradation of water quality and a possible reduction of aquatic life. In addition, runoff and sewage from urban and residential areas can contain high amounts of inorganic and organic nutrients, which may also affect water quality. For example, blooms of the cyanobacterium Lyngbya majuscula along the coastline of southeast Queensland are an indicator of at least short-term decreases in water quality. Although Australian catchments, including those with intensive forms of land use, generally show a low export of nutrients compared with North American and European catchments, certain land use practices may still have a detrimental effect on the coastal environment. Numerous studies have reported on nutrient cycling and associated processes at the catchment scale in the Northern Hemisphere. Comparable studies in Australia, in particular in subtropical regions, are, however, limited, and there is a paucity of data, in particular for inorganic and organic forms of nitrogen and phosphorus; these nutrients are important limiting factors for algal blooms in surface waters. Therefore, monitoring N and P and understanding the sources and pathways of these nutrients within a catchment are important for coastal zone management. Although Australia is the driest continent, in subtropical regions such as southeast Queensland rainfall patterns have a significant effect on runoff and thus on the nutrient cycle at a catchment scale. These rainfall patterns are becoming increasingly variable. The monitoring of these climatic conditions and of the hydrological response of agricultural catchments is therefore also important to reduce anthropogenic effects on surface water and groundwater quality. This study consists of an integrated hydrological–hydrochemical approach that assesses N and P in an environment with multiple land uses. The main aim is to determine the nutrient cycle within a representative coastal catchment in southeast Queensland, the Elimbah Creek catchment. In particular, the investigation confirms the influence of forestry and agriculture on N and P forms, sources, distribution and fate in the surface waters and groundwaters of this subtropical setting. In addition, the study determines whether N and P are subject to transport into the adjacent estuary and thus into the marine environment; also considered is the effect of local topography, soils and geology on N and P sources and distribution. The thesis is structured around four components, each reported individually. The first paper determines how catchment settings and processes control N and P concentrations in stream water, riverbank sediment and shallow groundwater, in particular during the extended dry conditions encountered during the study. Temporal and spatial factors such as seasonal changes, soil character, land use and catchment morphology are considered, as well as their effect on the distribution of N and P in surface waters and associated groundwater.
A total of 30 surface water and 13 shallow groundwater sampling sites were established throughout the catchment to represent the dominant soil types and the land use upstream of each sampling location. Sampling comprised five rounds and was conducted over one year, between October 2008 and November 2009. Surface water and groundwater samples were analysed for all major dissolved inorganic forms of N and for total N. Phosphorus was determined in the form of dissolved reactive P (predominantly orthophosphate) and total P. In addition, extracts of stream bank sediments and soil grab samples were analysed for these N and P species. Findings show that major storm events, in particular after long periods of drought conditions, are the driving force of N cycling. This is expressed by higher inorganic N concentrations in the agricultural subcatchment compared to the forested subcatchment. Nitrate N is the dominant inorganic form of N in both the surface waters and groundwaters, and values are significantly higher in the groundwaters. Concentrations in the surface water range from 0.03 to 0.34 mg N L-1; organic N concentrations are considerably higher (average range: 0.33 to 0.85 mg N L-1), in particular in the forested subcatchment. Average NO3-N in the groundwater ranges from 0.39 to 2.08 mg N L-1, and organic N averages between 0.07 and 0.3 mg N L-1. The stream bank sediments are dominated by organic N (range: 0.53 to 0.65 mg N L-1), and the dominant inorganic form of N is NH4-N, with values ranging from 0.38 to 0.41 mg N L-1. Topography and soils, however, were not found to have a significant effect on N and P concentrations in the waters. Detectable phosphorus in the surface and groundwaters of the catchment is limited to several locations, typically in the proximity of areas with intensive animal use; in soils and sediments, P is negligible. In the second paper, the stable isotopes of N (14N/15N) and of H2O (16O/18O and 2H/H) in surface and groundwaters are used to identify sources of dissolved inorganic and organic N in these waters and to determine their pathways within the catchment; specific emphasis is placed on the respective influences of forestry and agriculture. Forestry is predominantly concentrated in the northern subcatchment (Beerburrum Creek), while agriculture is mainly found in the southern subcatchment (Six Mile Creek). Results show that agriculture (horticulture, crops, grazing) is the main source of inorganic N in the surface waters of the agricultural subcatchment, and its isotopic signature shows a close link to evaporation processes that may occur during water storage in the farm dams used for irrigation. Groundwaters are subject to denitrification processes that may result in reduced dissolved inorganic N concentrations. Soil organic matter delivers most of the inorganic N to the surface water in the forested subcatchment. Here, precipitation, and subsequently runoff, is the main source of the surface waters. Groundwater in this area is affected by agricultural processes. The findings also show that the catchment can attenuate the effects of anthropogenic land use on surface water quality. Riparian strips of natural remnant vegetation, commonly 50 to 100 m in width, act as buffer zones along the drainage lines in the catchment and remove inorganic N from the soil water before it enters the creek.
These riparian buffer zones are common in most agricultural catchments of southeast Queensland and appear to reduce the impact of agriculture on stream water quality and, subsequently, on the estuarine and marine environments. This reduction is expressed by a significant decrease in DIN concentrations, from 1.6 mg N L-1 to 0.09 mg N L-1, and a decrease in the δ15N signatures from upstream surface water locations downstream to the outlet of the agricultural subcatchment. Further testing is, however, necessary to confirm these processes. Most importantly, the amount of N that is transported to the adjacent estuary is shown to be negligible. The third and fourth components of the thesis use a hydrological catchment modelling approach to determine the water balance of the Elimbah Creek catchment. The model is then used to simulate the effects of land use on the water balance and nutrient loads of the study area. The tool used is the internationally widely applied Soil and Water Assessment Tool (SWAT). Knowledge about the water cycle of a catchment is imperative in nutrient studies, as processes such as rainfall, surface runoff, soil infiltration and routing of water through the drainage system are the driving forces of the catchment nutrient cycle. Long-term information about discharge volumes of the creeks and rivers does not, however, exist for a number of agricultural catchments in southeast Queensland, and such information is necessary to calibrate and validate numerical models. Therefore, a two-step modelling approach was used in which parameter values calibrated and validated for a nearby gauged reference catchment served as starting values for the ungauged Elimbah Creek catchment. Transposing the monthly calibrated and validated parameter values from the reference catchment to the ungauged catchment significantly improved model performance, showing that the hydrological model of the catchment of interest is a strong predictor of the water balance. The model efficiency coefficient (EF) shows that 94% of the simulated discharge matches the observed flow, whereas only 54% of the observed streamflow was simulated by the SWAT model prior to using the validated values from the reference catchment. In addition, the hydrological model confirmed that total surface runoff contributes the majority of flow to the surface water in the catchment (65%); only a small proportion of the water in the creek is contributed by total baseflow (35%). This finding supports the results of the stable isotopes 16O/18O and 2H/H, which show that the main source of water in the creeks is either local precipitation or irrigation water delivered by surface runoff; a contribution from the groundwater (baseflow) to the creeks could not be identified using 16O/18O and 2H/H. In addition, the SWAT model calculated that around 68% of the rainfall occurring in the catchment is lost through evapotranspiration, reflecting the prevailing long-term drought conditions observed prior to and during the study. Stream discharge from the forested subcatchment was an order of magnitude lower than discharge from the agricultural Six Mile Creek subcatchment. A simulated change in land use from forestry to agriculture did not significantly change the catchment water balance; however, nutrient loads increased considerably. Conversely, a simulated change from agriculture to forestry resulted in a significant decrease of nitrogen loads.
The findings of the thesis and the approach used are shown to be of value to catchment water quality monitoring on a wider scale, in particular regarding the implications of mixed land use for nutrient forms, distributions and concentrations. The study confirms that in the tropics and subtropics the water balance is affected by extended dry periods and by seasonal rainfall with intensive storm events. In particular, the comprehensive data set of inorganic and organic N and P forms in the surface and groundwaters of this subtropical setting, acquired during the one-year sampling program, may be used in similar catchment hydrological studies where such detailed information is missing. Also, the study concludes that riparian buffer zones along the catchment drainage system attenuate the transport of nitrogen from agricultural sources in the surface water. Concentrations of N decreased from upstream to downstream locations and were negligible at the outlet of the catchment.
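As an aside on the model efficiency coefficient (EF) reported above: assuming it refers to the Nash-Sutcliffe form commonly used for SWAT calibrations, the sketch below shows how such a coefficient is computed from paired observed and simulated discharge series. The discharge values are invented purely for illustration and do not come from the thesis.

```python
import numpy as np

def model_efficiency(observed, simulated):
    """Nash-Sutcliffe model efficiency coefficient:
    EF = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    EF = 1 indicates a perfect match; EF <= 0 means the model performs
    no better than simply predicting the mean of the observations."""
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

# Hypothetical monthly discharge values (m^3/s), for illustration only
obs = [1.2, 0.8, 2.5, 4.1, 0.6, 0.3]
sim = [1.1, 0.9, 2.2, 4.4, 0.5, 0.4]
print(f"EF = {model_efficiency(obs, sim):.2f}")
```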
Abstract:
Given global demand for new infrastructure, governments face substantial challenges in funding new infrastructure and delivering Value for Money (VfM). As part of the background to this challenge, a critique is given of current practice in selecting the approach used to procure major public sector infrastructure in Australia, which is akin to the Multi-Attribute Utility Approach (MAUA). To contribute towards addressing the key weaknesses of MAUA, a new first-order procurement decision-making model is presented. The model addresses the make-or-buy decision (risk allocation), the bundling decision (property rights incentives) and the exchange relationship decision (relational to arm's-length exchange) in its novel approach to articulating a procurement strategy designed to yield superior VfM across the whole life of the asset. The aim of this paper is to report on the development of this decision-making model in terms of the procedural tasks to be followed and the method being used to test the model. The planned approach to testing the model uses a sample of 87 major Australian infrastructure projects totalling AUD32 billion and deploys expressions of interest, as an indicator of competition, as a key proxy for VfM.
Abstract:
BACKGROUND: Infection by dengue virus (DENV) is a major public health concern in hundreds of tropical and subtropical countries. French Polynesia (FP) regularly experiences epidemics that initiate, or follow, DENV circulation in other South Pacific Island Countries (SPICs). In January 2009, after a decade of serotype 1 (DENV-1) circulation, the first cases of DENV-4 infection were reported in FP. Two months later a new epidemic emerged, occurring about 20 years after the previous circulation of DENV-4 in FP. In this study, we investigated the epidemiological and molecular characteristics of the introduction, spread and genetic microevolution of DENV-4 in FP. METHODOLOGY/PRINCIPAL FINDINGS: Epidemiological data suggested that recent transmission of DENV-4 in FP started in the Leeward Islands and that this serotype quickly displaced DENV-1 throughout FP. Phylogenetic analyses of the nucleotide sequences of the envelope (E) gene of 64 DENV-4 strains collected in FP in the 1980s and in 2009-2010, together with some additional strains from other SPICs, showed that DENV-4 strains from the SPICs were distributed into genotypes IIa and IIb. Recent FP strains were distributed into two clusters, each comprising viruses from other but distinct SPICs, suggesting that the emergence of DENV-4 in FP in 2009 resulted from multiple introductions. In addition, we observed that almost all strains collected in the SPICs in the 1980s exhibit an amino acid (aa) substitution V287I within domain I of the E protein, and all recent South Pacific strains exhibit a T365I substitution within domain III. CONCLUSIONS/SIGNIFICANCE: This study confirmed the cyclic re-emergence and displacement of DENV serotypes in FP. Moreover, our results showed that specific aa substitutions on the E protein were present on all DENV-4 strains circulating in SPICs. These substitutions, probably acquired and subsequently conserved, could reflect a founder effect associated with the epidemiological, geographical, eco-biological and social specificities of the SPICs.
Abstract:
Mapping Multiple Literacies brings together the latest theory and research in the fields of literacy study and European philosophy, Multiple Literacies Theory (MLT) and the philosophical work of Gilles Deleuze. It frames the process of becoming literate as a fluid process involving multiple modes of presentation, and explains these processes in terms of making maps of our social lives and ways of doing things together. For Deleuze, language acquisition is a social activity of which we are a part, but only one part amongst many others. Masny and Cole draw on Deleuze's thinking to expand the repertoires of literacy research and understanding. They outline how we can understand literacy as a social activity and map the ways in which becoming literate may take hold and transform communities. The chapters in this book weave together theory, data and practice to open up a creative new area of literacy studies and to provoke vigorous debate about the sociology of literacy.
Abstract:
This thematic issue on education and the politics of becoming focuses on how a Multiple Literacies Theory (MLT) plugs into practice in education. MLT does this by creating an assemblage between discourse, text, resonance and sensations. What does this produce? Becoming AND how one might live are the product of an assemblage (May, 2005; Semetsky, 2003). In this paper, MLT is the approach that explores the connection between educational theory and practice through the lens of an empirical study of multilingual children acquiring multiple writing systems simultaneously. The introduction explicates discourse, text, resonance, sensation and becoming. The second section introduces certain Deleuzian concepts that plug into MLT. The third section serves as an introduction to MLT. The fourth section is devoted to the study by way of a rhizoanalysis. Finally, drawing on the concept of the rhizome, this article exits with potential lines of flight opened by MLT. These are becomings which highlight the significance of this work in terms of transforming not only how literacies are conceptualized, especially in minority language contexts, but also how one might live.
Abstract:
Global Navigation Satellite Systems (GNSS)-based observation systems can provide high precision positioning and navigation solutions in real time, at the sub-centimetre level, if carrier phase measurements are used in differential mode and the bias and noise terms are handled well. However, these carrier phase measurements are ambiguous due to unknown integer numbers of cycles. One key challenge in the differential carrier phase mode is to fix the integer ambiguities correctly. On the other hand, in safety-of-life or liability-critical applications, such as vehicle safety positioning and aviation, not only is high accuracy required, but reliability is also important. This PhD research studies how to achieve high reliability for ambiguity resolution (AR) in a multi-GNSS environment. GNSS ambiguity estimation and validation problems are the focus of the research effort. In particular, we study the case of multiple constellations, covering the initial to full operations of the foreseeable Galileo, GLONASS, Compass and QZSS navigation systems over the next few years to the end of the decade. Since real observation data are only available from the GPS and GLONASS systems, the simulation method named Virtual Galileo Constellation (VGC) is applied to generate observational data from another constellation in the data analysis. In addition, both full ambiguity resolution (FAR) and partial ambiguity resolution (PAR) algorithms are used in processing single and dual constellation data. Firstly, a brief overview of related work on AR methods and reliability theory is given. Next, a modified inverse integer Cholesky decorrelation method and its performance on AR are presented. Subsequently, a new measure of decorrelation performance called the orthogonality defect is introduced and compared with other measures. Furthermore, a new AR scheme that considers the ambiguity validation requirement in the control of the search space size is proposed to improve the search efficiency. With respect to the reliability of AR, we also discuss the computation of the ambiguity success rate (ASR) and confirm that the success rate computed with the integer bootstrapping method is quite a sharp approximation to the actual integer least-squares (ILS) success rate. The advantages of multi-GNSS constellations are examined in terms of the PAR technique involving a predefined ASR. Finally, a novel satellite selection algorithm for reliable ambiguity resolution, called SARA, is developed. In summary, the study demonstrates that when the ASR is close to one, the reliability of AR can be guaranteed and the ambiguity validation is effective. The work then focuses on new strategies to improve the ASR, including a partial ambiguity resolution procedure with a predefined success rate and a novel satellite selection strategy with a high success rate. The proposed strategies bring the significant benefits of multi-GNSS signals to real-time, high-precision and high-reliability positioning services.
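For context on the ambiguity success rate (ASR) mentioned above, the sketch below evaluates the standard integer bootstrapping success rate from the GNSS literature, P = prod_i (2*Phi(1/(2*sigma_i)) - 1), where sigma_i are the conditional standard deviations of the (ideally decorrelated) ambiguities. This is a generic illustration of that textbook formula, not the thesis's implementation, and the example values are invented.

```python
from math import erf, prod, sqrt

def bootstrapping_success_rate(cond_std_devs):
    """Integer bootstrapping success rate:
    P = prod_i ( 2*Phi(1/(2*sigma_i)) - 1 ),
    where sigma_i are the conditional standard deviations of the
    sequentially rounded (ideally decorrelated) ambiguities and Phi is
    the standard normal CDF.  It is commonly used as a sharp lower-bound
    approximation to the integer least-squares success rate."""
    phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return prod(2.0 * phi(1.0 / (2.0 * s)) - 1.0 for s in cond_std_devs)

# Hypothetical conditional standard deviations (in cycles) after decorrelation
print(f"ASR = {bootstrapping_success_rate([0.05, 0.08, 0.12]):.4f}")
```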
Abstract:
A procedure for the evaluation of multiple scattering contributions is described, for deep inelastic neutron scattering (DINS) studies using an inverse geometry time-of-flight spectrometer. The accuracy of a Monte Carlo code DINSMS, used to calculate the multiple scattering, is tested by comparison with analytic expressions and with experimental data collected from polythene, polycrystalline graphite and tin samples. It is shown that the Monte Carlo code gives an accurate representation of the measured data and can therefore be used to reliably correct DINS data.
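One common way a Monte Carlo estimate of this kind is applied is a scale-and-subtract correction: the simulated multiple-scattering spectrum is normalised against the simulated total scattering and subtracted from the measured spectrum. The sketch below illustrates that generic idea only; it is an assumption for illustration, not necessarily the procedure implemented in DINSMS.

```python
import numpy as np

def subtract_multiple_scattering(measured, simulated_total, simulated_multiple):
    """Generic scale-and-subtract correction: normalise the simulated
    multiple-scattering spectrum by the ratio of measured to simulated
    total counts, then subtract it from the measured spectrum."""
    measured = np.asarray(measured, dtype=float)
    scale = measured.sum() / np.asarray(simulated_total, dtype=float).sum()
    return measured - scale * np.asarray(simulated_multiple, dtype=float)

# Hypothetical time-of-flight spectra (arbitrary units), for illustration only
measured = [10.0, 40.0, 90.0, 40.0, 10.0]
sim_total = [9.0, 38.0, 95.0, 42.0, 11.0]
sim_multiple = [2.0, 5.0, 8.0, 5.0, 2.0]
print(subtract_multiple_scattering(measured, sim_total, sim_multiple))
```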
Abstract:
Welcome to the Quality assessment matrix. This matrix is designed for highly qualified discipline experts to evaluate their course, major or unit in a systematic manner. The primary purpose of the Quality assessment matrix is to provide a tool with which a group of academic staff at universities can collaboratively review the assessment within a course, major or unit annually. The annual review will result in you being ready for an external curriculum review at any point in time. This tool is designed for use in a workshop format with one, two or more academic staff, and will lead to an action plan for implementation.
Abstract:
The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables. Thus pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely. This can be difficult to achieve in complex applications. In this paper we propose to take advantage of the multiple central processing units (CPUs) that are readily available on most standard desktop computers. Here the likelihood is estimated independently on the multiple CPUs, with the ultimate estimate of the likelihood being the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but the variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
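A minimal sketch of the averaging idea follows, assuming some unbiased single-CPU likelihood estimator (faked here with a noisy estimator whose mean is the 'true' value of 1.0). The function names and the use of Python's multiprocessing module are illustrative assumptions, not the two technologies compared in the paper.

```python
from multiprocessing import Pool
import numpy as np

def estimate_likelihood(args):
    """Placeholder for an unbiased likelihood estimator (e.g. importance
    sampling or a particle filter).  For illustration it returns a noisy
    estimate whose expectation is the 'true' likelihood of 1.0."""
    theta, seed = args
    rng = np.random.default_rng(seed)
    return rng.gamma(shape=10.0, scale=0.1)  # mean 1.0, so unbiased

def averaged_likelihood(theta, n_cpus=4):
    """Estimate the likelihood independently on several CPUs and average.
    The average remains unbiased while its variance falls by 1/n_cpus,
    which is what a pseudo-marginal sampler needs in order to mix well."""
    with Pool(processes=n_cpus) as pool:
        estimates = pool.map(estimate_likelihood,
                             [(theta, seed) for seed in range(n_cpus)])
    return float(np.mean(estimates))

if __name__ == "__main__":
    print(averaged_likelihood(theta=0.5, n_cpus=4))
```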