Abstract:
Kinoite Ca2Cu2Si3O10(OH)4 is a mineral named after a Jesuit missionary. Raman and infrared spectroscopy have been used to characterise the structure of the mineral. The Raman spectrum is characterised by an intense, sharp band at 847 cm-1 assigned to the ν1 (A1g) symmetric stretching vibration. Intense, sharp bands at 951, 994 and 1000 cm-1 are assigned to the ν3 (Eu, A2u, B1g) SiO4 antisymmetric stretching vibrations. Multiple ν2 SiO4 vibrational modes indicate strong distortion of the SiO4 tetrahedra. Multiple CaO and CuO stretching bands are observed. Raman spectroscopy, confirmed by infrared spectroscopy, clearly shows that hydroxyl units are involved in the kinoite structure. Based upon the infrared spectra, it is proposed that water is also involved in the kinoite structure. Based upon vibrational spectroscopy, the formula of kinoite is defined as Ca2Cu2Si3O10(OH)4•xH2O.
Abstract:
Providing effective IT support for business processes has become crucial for enterprises to stay competitive. In response to this need, numerous process support paradigms (e.g., workflow management, service flow management, case handling), process specification standards (e.g., WS-BPEL, BPML, BPMN), process tools (e.g., ARIS Toolset, Tibco Staffware, FLOWer), and supporting methods have emerged in recent years. Summarized under the term “Business Process Management” (BPM), these paradigms, standards, tools, and methods have become a success-critical instrument for improving process performance.
Abstract:
This research is one of several ongoing studies conducted within the IT Professional Services (ITPS) research programme at Queensland University of Technology (QUT). In 2003, ITPS introduced the IS-Impact model, a measurement model for measuring information systems success from the viewpoint of multiple stakeholders. The model, along with its instrument, is robust, simple, yet generalisable, and yields results that are comparable across time, stakeholders, different systems and system contexts. The IS-Impact model is defined as “a measure at a point in time, of the stream of net benefits from the Information System (IS), to date and anticipated, as perceived by all key-user-groups”. The model represents four dimensions, which are ‘Individual Impact’, ‘Organizational Impact’, ‘Information Quality’ and ‘System Quality’. The two Impact dimensions measure the up-to-date impact of the evaluated system, while the remaining two Quality dimensions act as proxies for probable future impacts (Gable, Sedera & Chan, 2008). To fulfil the goal of ITPS, “to develop the most widely employed model”, this research re-validates and extends the IS-Impact model in a new context. This method/context-extension research aims to test the generalisability of the model by addressing its known limitations. One of these limitations relates to the extent of the model's external validity. In order to gain wide acceptance, a model should be consistent and work well in different contexts. The IS-Impact model, however, was only validated in the Australian context, and packaged software was chosen as the IS under study. Thus, this study is concerned with whether the model can be applied in a different context. Aiming for a robust and standardised measurement model that can be used across different contexts, this research re-validates and extends the IS-Impact model and its instrument to public sector organisations in Malaysia. The overarching research question (managerial question) of this research is “How can public sector organisations in Malaysia measure the impact of information systems systematically and effectively?” With two main objectives, the managerial question is broken down into two specific research questions. The first research question addresses the applicability (relevance) of the dimensions and measures of the IS-Impact model in the Malaysian context. Moreover, this research question addresses the completeness of the model in the new context. Initially, this research assumes that the dimensions and measures of the IS-Impact model are sufficient for the new context. However, some IS researchers suggest that the selection of measures needs to be done purposely for different contextual settings (DeLone & McLean, 1992; Rai, Lang & Welker, 2002). Thus, the first research question is as follows: “Is the IS-Impact model complete for measuring the impact of IS in Malaysian public sector organisations?” [RQ1]. The IS-Impact model is a multidimensional model that consists of four dimensions or constructs. Each dimension is represented by formative measures or indicators. Formative measures are known as composite variables because these measures make up or form the construct, or, in this case, the dimension in the IS-Impact model. These formative measures define different aspects of the dimension; thus, a measurement model of this kind needs to be tested not just on the structural relationships between the constructs but also on the validity of each measure.
In a previous study, the IS-Impact model was validated using formative validation techniques, as proposed in the literature (i.e., Diamantopoulos and Winklhofer, 2001; Diamantopoulos and Siguaw, 2006; Petter, Straub and Rai, 2007). However, there is potential for improving the validation testing of the model by adding more criterion or dependent variables. This includes identifying a consequence of the IS-Impact construct for the purpose of validation. Moreover, a different approach is employed in this research, whereby the validity of the model is tested using the Partial Least Squares (PLS) method, a component-based structural equation modelling (SEM) technique. Thus, the second research question addresses the construct validation of the IS-Impact model: “Is the IS-Impact model valid as a multidimensional formative construct?” [RQ2]. This study employs two rounds of surveys, each having a different and specific aim. The first is qualitative and exploratory, aiming to investigate the applicability and sufficiency of the IS-Impact dimensions and measures in the new context. This survey was conducted in a state government in Malaysia. A total of 77 valid responses were received, yielding 278 impact statements. The results from the qualitative analysis demonstrate the applicability of most of the IS-Impact measures. The analysis also shows that a significant new measure emerged from the context. This new measure was added as one of the System Quality measures. The second survey is a quantitative survey that aims to operationalise the measures identified from the qualitative analysis and rigorously validate the model. This survey was conducted in four state governments (including the state government that was involved in the first survey). A total of 254 valid responses were used in the data analysis. Data was analysed using structural equation modelling techniques, following the guidelines for formative construct validation, to test the validity and reliability of the constructs in the model. This is the first study to extend the complete IS-Impact model to a new context that differs in nationality, language and the type of information system (IS). The main contribution of this research is to present a comprehensive, up-to-date IS-Impact model, which has been validated in the new context. The study has accomplished its purpose of testing the generalisability of the IS-Impact model and continuing the IS evaluation research by extending it in the Malaysian context. A further contribution is a validated Malaysian-language IS-Impact measurement instrument. It is hoped that the validated Malaysian IS-Impact instrument will encourage related IS research in Malaysia, and that the demonstrated model validity and generalisability will encourage a cumulative tradition of research previously not possible.
The study entailed several methodological improvements on prior work, including: (1) new criterion measures for the overall IS-Impact construct, employed in ‘identification through measurement relations’; (2) a stronger, multi-item ‘Satisfaction’ construct, employed in ‘identification through structural relations’; (3) an alternative version of the main survey instrument in which items are randomised (rather than blocked) for comparison with the main survey data, to address possible common method variance (no significant differences between the two survey instruments were observed); (4) demonstration of a validation process for formative indexes of a multidimensional, second-order construct (existing examples mostly involve unidimensional constructs); (5) testing for the presence of suppressor effects that influence the significance of some measures and dimensions in the model; and (6) demonstration of the effect of an imbalanced number of measures within a construct on the relative contribution of each dimension in a multidimensional model.
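To make the formative-measurement point above concrete, the following is a minimal sketch (in Python) of one standard step in formative construct validation: checking indicators for multicollinearity via variance inflation factors and forming a composite score. The indicator data, sample size and equal weighting are hypothetical stand-ins; the thesis itself used PLS-based structural equation modelling, not this simplified procedure.

```python
# Minimal sketch (not the thesis's PLS analysis): check formative indicators for
# multicollinearity via VIF, a common step in formative construct validation,
# then form a naive equal-weight composite. Data and indicators are hypothetical.
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n_samples x n_indicators)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    vifs = []
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])        # regress indicator j on the rest
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        r2 = 1.0 - resid.var() / y.var()
        vifs.append(1.0 / (1.0 - r2) if r2 < 1 else np.inf)
    return np.array(vifs)

# Hypothetical survey responses for four 'System Quality' indicators (1-7 scale)
rng = np.random.default_rng(0)
X = rng.integers(1, 8, size=(254, 4)).astype(float)

# VIFs near 1 (well below common 3-5 thresholds) suggest low collinearity
print("VIF per indicator:", np.round(vif(X), 2))

# Equal-weight composite as a naive stand-in for PLS-estimated outer weights
composite = X.mean(axis=1)
print("Composite score (first 5 respondents):", composite[:5])
```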
Abstract:
The primary objective of the experiments reported here was to demonstrate the effects of opening up the design envelope for auditory alarms on the ability of people to learn the meanings of a set of alarms. Two sets of alarms were tested: one already extant, and one newly designed for the same set of functions according to a rationale, set out by the authors, aimed at increasing the heterogeneity of the alarm set and incorporating some well-established principles of alarm design. For both sets of alarms, a similarity-rating experiment was followed by a learning experiment. The results showed that the newly-designed set was judged to be more internally dissimilar, and easier to learn, than the extant set. The design rationale outlined in the paper is useful for design purposes in a variety of practical domains and shows how alarm designers, even at a relatively late stage in the design process, can improve the efficacy of an alarm set.
Abstract:
Collaborative question answering (cQA) portals such as Yahoo! Answers allow users, as askers or answer authors, to communicate and exchange information through the asking and answering of questions in the network. In their current set-up, answers to a question are arranged in chronological order. For effective information retrieval, it would be advantageous to have users’ answers ranked according to their quality. This paper proposes a novel approach for evaluating and ranking users’ answers and recommending the top-n quality answers to information seekers. The proposed approach is based on a user-reputation method which assigns a score to an answer reflecting its author’s reputation level in the network. The approach is evaluated on a dataset collected from a live cQA portal, namely Yahoo! Answers. To compare the results obtained by the non-content-based user-reputation method, experiments were also conducted with several content-based methods that assign a score to an answer reflecting its content quality. Various combinations of non-content-based and content-based scores were also used in comparing results. Empirical analysis shows that the proposed method is able to rank users’ answers and recommend the top-n answers with good accuracy. The results of the proposed method outperform those of the content-based methods, the various combinations, and the popular link analysis method, HITS.
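As an illustration of the general idea of reputation-based answer ranking described above, the following Python sketch assigns each author a reputation score from their past answers and ranks a question's answers by that score. The scoring formula and the toy data are hypothetical, not the paper's actual method.

```python
# Illustrative sketch only: a simple author-reputation score used to rank
# answers to a question. The formula and data are hypothetical.
from collections import defaultdict

# (question_id, answer_id, author, was_best_answer)
answers = [
    ("q1", "a1", "alice", True),
    ("q1", "a2", "bob",   False),
    ("q2", "a3", "alice", True),
    ("q2", "a4", "carol", False),
    ("q3", "a5", "bob",   True),
]

# Reputation: smoothed fraction of an author's past answers chosen as best
totals, best = defaultdict(int), defaultdict(int)
for _, _, author, is_best in answers:
    totals[author] += 1
    best[author] += int(is_best)
reputation = {a: (best[a] + 1) / (totals[a] + 2) for a in totals}  # Laplace smoothing

def rank_answers(question_id, top_n=2):
    """Return the top-n answers to a question, ordered by author reputation."""
    candidates = [(aid, reputation[author])
                  for qid, aid, author, _ in answers if qid == question_id]
    return sorted(candidates, key=lambda x: x[1], reverse=True)[:top_n]

print(rank_answers("q1"))  # answers by higher-reputation authors come first
```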
Abstract:
Thermogravimetry combined with evolved gas mass spectrometry has been used to characterise the mineral ardealite and to ascertain the thermal stability of this ‘cave’ mineral. The mineral ardealite Ca2(HPO4)(SO4)•4H2O is formed through the reaction of calcite with bat guano. The mineral shows disorder and the composition varies depending on the origin of the mineral. Thermal analysis shows that the mineral starts to decompose over the temperature range 100 to 150°C with some loss of water. The critical temperature for water loss is around 215°C and above this temperature the mineral structure is altered. It is concluded that the mineral starts to decompose at 125°C, with all waters of hydration being lost after 226°C. Some loss of sulphate occurs over a broad temperature range centred upon 565°C. The final decomposition temperature is 823°C with loss of the sulphate and phosphate anions.
Abstract:
Young children are the most vulnerable and most at risk of environmental challenges, current and future. Yet, early learning around environment and sustainability issues and topics has been neglected and underrated in early childhood education even though there is an expanding body of research literature – from economics, neuroscience, sociology and health – that shows that early investments in human capital offer substantial returns for individuals and for communities and have a long reach into the future. Early childhood education for sustainability (ECEfS) – a synthesis of early childhood education (ECE) and education for sustainability (EfS) – builds on groundings in play, outdoor learning and nature education, but takes a stronger focus on learning about, and engagement with, environmental and sustainability issues. Child participation and agency are central to ECEfS and can relate, for example, to local environmental problem-solving such as water and energy conservation or waste reduction in a childcare centre, kindergarten or preschool, or young children’s social learning for Indigenous Reconciliation and cultural inclusivity. While the ECE field has been much slower than other educational sectors in taking up the challenges of sustainability, this situation is rapidly changing as early childhood practitioners begin to engage – it is fast moving from the margins of early childhood curriculum and pedagogic decision-making into the mainstream. This presents challenges, however, as ECEfS is somewhat misunderstood and misrepresented and, as a new field, is under-researched and under-theorised.
Abstract:
With the growing number of XML documents on the Web it becomes essential to effectively organise these XML documents in order to retrieve useful information from them. A possible solution is to apply clustering on the XML documents to discover knowledge that promotes effective data management, information retrieval and query processing. However, many issues arise in discovering knowledge from these types of semi-structured documents due to their heterogeneity and structural irregularity. Most of the existing research on clustering techniques focuses on only one feature of the XML documents, either their structure or their content, due to scalability and complexity problems. The knowledge gained in the form of clusters based on the structure or the content alone is not suitable for real-life datasets. It therefore becomes essential to include both the structure and content of XML documents in order to improve the accuracy and meaning of the clustering solution. However, the inclusion of both these kinds of information in the clustering process results in a huge overhead for the underlying clustering algorithm because of the high dimensionality of the data. The overall objective of this thesis is to address these issues by: (1) proposing methods to utilise frequent pattern mining techniques to reduce the dimension; (2) developing models to effectively combine the structure and content of XML documents; and (3) utilising the proposed models in clustering. This research first determines the structural similarity in the form of frequent subtrees and then uses these frequent subtrees to represent the constrained content of the XML documents in order to determine the content similarity. A clustering framework with two types of models, implicit and explicit, is developed. The implicit model uses a Vector Space Model (VSM) to combine the structure and the content information. The explicit model uses a higher order model, namely a 3-order Tensor Space Model (TSM), to explicitly combine the structure and the content information. This thesis also proposes a novel incremental technique to decompose large-sized tensor models and utilise the decomposed solution for clustering the XML documents. The proposed framework and its components were extensively evaluated on several real-life datasets exhibiting extreme characteristics to understand the usefulness of the proposed framework in real-life situations. Additionally, this research evaluates the outcome of the clustering process on the collection selection problem in information retrieval, using the Wikipedia dataset. The experimental results demonstrate that the proposed frequent pattern mining and clustering methods outperform the related state-of-the-art approaches. In particular, the proposed framework of utilising frequent structures for constraining the content shows an improvement in accuracy over content-only and structure-only clustering results. The scalability evaluation experiments conducted on large-scale datasets clearly show the strengths of the proposed methods over state-of-the-art methods. In particular, this thesis work contributes to effectively combining the structure and the content of XML documents for clustering, in order to improve the accuracy of the clustering solution. In addition, it also contributes by addressing the research gaps in frequent pattern mining to generate efficient and concise frequent subtrees with various node relationships that could be used in clustering.
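The following Python sketch illustrates, in a highly simplified form, the 'implicit' model idea of combining structural and content features of XML documents in a single vector space for clustering. The example patterns, terms, documents and weighting are hypothetical; the thesis's frequent-subtree mining and tensor model are far more involved.

```python
# A minimal sketch: represent each XML document by one vector that concatenates
# structural features (presence of frequent subtrees, here shown as paths) with
# content term counts, so a standard clustering algorithm can use both.
import numpy as np

# Hypothetical frequent structural patterns (e.g., mined subtrees as paths)
structure_patterns = ["book/title", "book/author", "article/abstract"]
# Hypothetical content vocabulary constrained to those structures
content_terms = ["xml", "mining", "cluster", "retrieval"]

docs = [
    {"structures": {"book/title", "book/author"}, "terms": {"xml": 3, "mining": 1}},
    {"structures": {"article/abstract"},          "terms": {"cluster": 2, "retrieval": 1}},
    {"structures": {"book/title"},                "terms": {"xml": 1, "cluster": 1}},
]

def to_vector(doc, alpha=0.5):
    """Concatenate binary structure features and term counts, weighted by alpha."""
    s = np.array([1.0 if p in doc["structures"] else 0.0 for p in structure_patterns])
    c = np.array([float(doc["terms"].get(t, 0)) for t in content_terms])
    return np.concatenate([alpha * s, (1 - alpha) * c])

X = np.vstack([to_vector(d) for d in docs])

# Pairwise cosine similarity between documents; a clustering algorithm
# (e.g. k-means) would operate on these combined vectors.
norms = np.linalg.norm(X, axis=1, keepdims=True)
sim = (X / norms) @ (X / norms).T
print(np.round(sim, 2))
```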
Abstract:
This PhD study examines whether water allocation becomes more productive when it is re-allocated from 'low'- to 'high'-efficiency alternative uses in village irrigation systems (VISs) in Sri Lanka. Reservoir-based agriculture is a collective farming economic activity in which inter-sectoral allocation of water is assumed to be inefficient due to market imperfections and weak user rights. Furthermore, the available literature shows that a 'head-tail syndrome' is the most common issue for intra-sectoral water management in 'irrigation' agriculture. This research analyses the issue of water allocation by using primary data collected from two surveys of 460 rice farmers and 325 fish farming groups in two administrative districts in Sri Lanka. Technical efficiency estimates are undertaken for both rice farming and culture-based fisheries (CBF) production. The equi-marginal principle is applied for inter- and intra-sectoral allocation of water. Welfare benefits of water re-allocation are measured through consumer surplus estimation. Based on these analyses, the overall findings of the thesis can be summarised as follows. The estimated mean technical efficiency (MTE) for rice farming is 73%. For CBF production, the estimated MTE is 33%. The technical efficiency distribution is skewed to the left for rice farming, while it is skewed to the right for CBF production. The results show that technical efficiency of rice farming can be improved by formalising transferability of land ownership and, therefore, water user rights by enhancing the institutional capacity of Farmer Organisations (FOs). Other effective tools for improving technical efficiency of CBF production are strengthening group stability of CBF farmers, improving the accessibility of official consultation, and attracting independent investments. Inter-sectoral optimal allocation shows that the estimated inefficient volume of water in rice farming, which can be re-allocated for CBF production, is 32%. With the application of successive policy instruments (e.g., a community transferable quota system and promoting CBF activities), there is potential for a threefold increase in marginal value product (MVP) of total reservoir water in VISs. The existing intra-sectoral inefficient volume of water use in tail-end fields and head-end fields can potentially be removed by reducing water use by 10% and 23% respectively and re-allocating this to middle fields. This re-allocation may enable a twofold increase in MVP of water used in rice farming without reducing the existing rice output, but will require developing irrigation practices to facilitate this re-allocation. Finally, the total productivity of reservoir water can be increased by responsible village level institutions and primary level stakeholders (i.e., co-management) sharing responsibility of water management, while allowing market forces to guide the efficient re-allocation decisions. This PhD has demonstrated that instead of farmers allocating water between uses haphazardly, they can now base their decisions on efficient water use with a view to increasing water productivity. Such an approach will, no doubt, enhance farmer incomes and community welfare.
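The allocation rule invoked above, the equi-marginal principle, can be stated compactly as follows; the notation is a generic two-use formulation for a fixed water stock, not the thesis's own specification:

```latex
% Equi-marginal allocation of a fixed water stock W between rice (r) and
% culture-based fisheries (f); notation is illustrative only.
\max_{w_r,\, w_f} \; V_r(w_r) + V_f(w_f)
\quad \text{s.t.} \quad w_r + w_f = W
\qquad \Longrightarrow \qquad
\frac{\partial V_r}{\partial w_r} = \frac{\partial V_f}{\partial w_f}
\;\; \bigl(\mathrm{MVP}_r = \mathrm{MVP}_f\bigr)
```

In words: water is allocated efficiently when the marginal value product of the last unit of water is the same in every use; otherwise, moving water from the low-MVP use to the high-MVP use raises total value.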
Abstract:
This article focuses on the satirical Australian show The Chaser’s War on Everything, and uses it to critically assess the potential political and social ramifications of what McNair (2006) has called ‘cultural chaos’. Drawing upon and analysing several examples from this particular program, alongside interviews with its production team and qualitative audience research, this article argues that this TV show’s engagement with politicians and political issues, in a way that departs from the conventions of traditional journalism, offers a significant opportunity for the interrogation of power. The program’s use of often bizarre and unexpected comedic confrontation allows it to present a perhaps more authentic image of political agents than is often cultivated in mainstream journalism. This suggests therefore that the shift from homogeneity to heterogeneity in the news media – which McNair (2006) sees as a key feature of cultural chaos – presents a significant challenge to those who wish to retain control over what the public sees and understands about the political world, and is a development which should be viewed in positive terms.
Abstract:
We report on analysis of discussions in an online community of people with chronic illness using socio-cognitively motivated, automatically produced semantic spaces. The analysis aims to further the emerging theory of "transition" (how people can learn to incorporate the consequences of illness into their lives). An automatically derived representation of sense of self for individuals is created in the semantic space by the analysis of the email utterances of the community members. The movement over time of the sense of self is visualised, via projection, with respect to axes of "ordinariness" and "extra-ordinariness". Qualitative evaluation shows that the visualisation is paralleled by the transitions of people during the course of their illness. The research aims to progress tools for analysis of textual data to promote greater use of tacit knowledge as found in online virtual communities. We hope it also encourages further interest in representation of sense-of-self.
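As a rough illustration of the projection step described above, the following Python sketch projects a series of 'sense of self' vectors onto an axis running between two anchor vectors, so that movement along an 'ordinariness' axis can be tracked over time. The vectors and anchors are random stand-ins, not output of the actual semantic-space construction.

```python
# Hypothetical sketch of projecting "sense of self" vectors onto an
# ordinariness axis in a semantic space; all vectors here are random stand-ins.
import numpy as np

rng = np.random.default_rng(1)
dim = 50

ordinary_anchor = rng.normal(size=dim)        # e.g. built from everyday-life terms
extraordinary_anchor = rng.normal(size=dim)   # e.g. built from illness-related terms
axis = ordinary_anchor - extraordinary_anchor
axis /= np.linalg.norm(axis)                  # unit vector from extra-ordinary to ordinary

# One "sense of self" vector per month of a member's emails (hypothetical)
monthly_self_vectors = rng.normal(size=(12, dim))

# Scalar coordinate of each month's vector along the ordinariness axis
trajectory = monthly_self_vectors @ axis
print(np.round(trajectory, 2))  # increasing values = movement toward "ordinariness"
```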
Abstract:
A broad range of positions is articulated in the academic literature around the relationship between recordings and live performance. Auslander (2008) argues that “live performance ceased long ago to be the primary experience of popular music, with the result that most live performances of popular music now seek to replicate the music on the recording”. Elliott (1995) suggests that “hit songs are often conceived and produced as unambiguous and meticulously recorded performances that their originators often duplicate exactly in live performances”. Wurtzler (1992) argues that “as socially and historically produced, the categories of the live and the recorded are defined in a mutually exclusive relationship, in that the notion of the live is premised on the absence of recording and the defining fact of the recorded is the absence of the live”. Yet many artists perform in ways that fundamentally challenge such positions. Whilst it is common practice for musicians across many musical genres to compose and construct their musical works in the studio such that the recording is, in Auslander’s words, the ‘original performance’, the live version is not simply an attempt to replicate the recorded version. Indeed in some cases, such replication is impossible. There are well known historical examples. Queen, for example, never performed the a cappella sections of Bohemian Rhapsody because they were too complex to perform live. A 1966 recording of the Beach Boys' studio creation Good Vibrations shows them struggling through the song prior to its release. This paper argues that as technology develops, the lines between the recording studio and live performance change and become more blurred. New models for performance emerge. In a 2010 live performance given by Grammy Award-winning artist Imogen Heap in New York, the artist undertakes a live, improvised construction of a piece as a performative act. She invites the audience to choose the key for the track and proceeds to layer up the various parts in front of the audience as a live performance act. Her recording process is thus revealed on stage in real time and she performs a process that would once have been confined to the recording studio. So how do artists bring studio production processes into the live context? What aspects of studio production are now performable and what consistent models can be identified amongst the various approaches now seen? This paper will present an overview of approaches to performative realisations of studio-produced tracks and will illuminate some emerging relationships between recorded music and performance across a range of contexts.
Abstract:
While there is strong interest in teaching values in Australia and internationally, there is little focus on young children’s moral values learning in the classroom. Research shows that personal epistemology influences teaching and learning in a range of education contexts, including moral education. This study examines relationships between personal epistemologies (children’s and teachers’), pedagogies, and school contexts for moral learning in two early years classrooms. Interviews with teachers and children and analysis of school policy revealed clear patterns of personal epistemologies and pedagogies within each school. A whole school approach to understanding personal epistemologies and practice for moral values learning is suggested.
Abstract:
This paper presents the design and implementation of a microstrip-to-parallel-strip balun, which is frequently used to feed balanced antennas. This wideband balun transition is composed of a parallel strip, which is connected to the spiral antenna, and a microstrip line in which the width of the ground plane is gradually reduced to eventually resemble the parallel strip. The taper accomplishes the mode and impedance transformation. This balun has significantly improved bandwidth characteristics. The entire circuit was fabricated on RT/duroid 5880 substrate. The circuit designs were simulated and optimised using CST Microwave Studio, and the simulated results are compared with the measured results. The back-to-back microstrip-to-parallel-strip transition has a return loss better than 10 dB over a wide bandwidth from 1.75 to 15 GHz. The performance of the proposed balun was validated with the spiral antenna. The measured results were compared with the simulated results and show that the antenna operates well over a wide frequency range from 2.5 to 15 GHz.
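For context on the figure quoted above, a return loss better than 10 dB corresponds, by the standard definitions, to a reflection coefficient magnitude below about 0.32 and a VSWR below about 1.9; the short Python helper below does the conversion. It uses textbook formulas only, not data from the reported measurements.

```python
# Convert a return-loss specification to reflection coefficient and VSWR.
# Standard definitions: RL = -20*log10(|Gamma|), VSWR = (1+|Gamma|)/(1-|Gamma|).
import math

def return_loss_to_gamma(rl_db):
    """Magnitude of the reflection coefficient from return loss in dB."""
    return 10 ** (-rl_db / 20.0)

def vswr(gamma):
    """Voltage standing wave ratio for a given |Gamma| < 1."""
    return (1 + gamma) / (1 - gamma)

g = return_loss_to_gamma(10.0)
print(f"|Gamma| = {g:.3f}, VSWR = {vswr(g):.2f}")  # ~0.316 and ~1.92
```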
Abstract:
A rule-based approach for classifying previously identified medical concepts in clinical free text into an assertion category is presented. There are six different categories of assertions for the task: Present, Absent, Possible, Conditional, Hypothetical and Not associated with the patient. The assertion classification algorithms were largely based on extending the popular NegEx and ConText algorithms. In addition, a health-based clinical terminology called SNOMED CT and other publicly available dictionaries were used to classify assertions that did not fit the NegEx/ConText model. The data for this task includes discharge summaries from Partners HealthCare and from Beth Israel Deaconess Medical Centre, as well as discharge summaries and progress notes from University of Pittsburgh Medical Centre. The set consists of 349 discharge reports, each with pairs of ground-truth concept and assertion files for system development, and 477 reports for evaluation. The system’s performance on the evaluation data set was 0.83, 0.83 and 0.83 for recall, precision and F1-measure, respectively. Although the rule-based system shows promise, further improvements can be made by incorporating machine learning approaches.
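To illustrate the general style of NegEx/ConText-like rule-based assertion classification described above, the following Python sketch looks for trigger phrases in a window around the target concept and maps them to an assertion category. The trigger lists, window logic and defaults are toy examples, not the system reported here.

```python
# Toy sketch of rule-based assertion classification in the NegEx/ConText style:
# search a window of tokens around the target concept for trigger phrases and
# map the first match to an assertion category. Triggers here are illustrative.
import re

TRIGGERS = [
    (r"\b(no|denies|without|absence of)\b", "absent"),
    (r"\b(possible|probable|may represent|suspicious for)\b", "possible"),
    (r"\b(if|should (he|she|the patient) develop)\b", "conditional"),
    (r"\b(risk of|rule out|return if)\b", "hypothetical"),
    (r"\b(family history of|mother|father|brother|sister)\b", "not_associated_with_patient"),
]

def classify_assertion(sentence, concept, window=6):
    """Assign an assertion category to `concept` within `sentence`."""
    tokens = sentence.lower().split()
    try:
        idx = tokens.index(concept.lower())
    except ValueError:
        idx = 0  # fall back to scanning from the start if the concept is not an exact token
    context = " ".join(tokens[max(0, idx - window): idx + window + 1])
    for pattern, category in TRIGGERS:
        if re.search(pattern, context):
            return category
    return "present"  # default category when no trigger fires

print(classify_assertion("The patient denies chest pain on exertion.", "pain"))   # absent
print(classify_assertion("Findings are suspicious for pneumonia.", "pneumonia"))  # possible
print(classify_assertion("Patient was admitted with pneumonia.", "pneumonia"))    # present
```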