Abstract:
Thermal transformations of natural calcium oxalate dihydrate, known in mineralogy as weddellite, have been studied using a combination of Raman microscopy and infrared emission spectroscopy. The vibrational spectroscopic data were complemented by high-resolution thermogravimetric analysis combined with evolved-gas mass spectrometry. TG–MS identified three mass-loss steps at 114, 422 and 592 °C. In the first mass-loss step only water is evolved; in the second and third steps carbon dioxide is evolved. The combination of Raman microscopy and a thermal stage clearly identifies the changes in molecular structure with thermal treatment. Weddellite is the stable phase up to the pre-dehydration temperature of 97 °C. At this temperature the phase formed is whewellite (calcium oxalate monohydrate), and above 114 °C the phase is anhydrous calcium oxalate. Above 422 °C, calcium carbonate is formed. Infrared emission spectroscopy shows that this mineral decomposes at around 650 °C. Changes in the position and intensity of the C=O and C–C stretching vibrations in the Raman spectra indicate the temperature ranges over which these phase changes occur.
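A minimal sketch of the decomposition sequence consistent with the phases and temperatures reported above (written in LaTeX; the intermediate carbon monoxide in the oxalate-to-carbonate step is an assumption based on the usual calcium oxalate chemistry, its oxidation accounting for the CO2 detected by mass spectrometry):

\begin{align}
\mathrm{CaC_2O_4\cdot 2H_2O} &\xrightarrow{\;\sim 97\,^{\circ}\mathrm{C}\;} \mathrm{CaC_2O_4\cdot H_2O} + \mathrm{H_2O}\uparrow \\
\mathrm{CaC_2O_4\cdot H_2O} &\xrightarrow{\;\sim 114\,^{\circ}\mathrm{C}\;} \mathrm{CaC_2O_4} + \mathrm{H_2O}\uparrow \\
\mathrm{CaC_2O_4} &\xrightarrow{\;\sim 422\,^{\circ}\mathrm{C}\;} \mathrm{CaCO_3} + \mathrm{CO}\uparrow \;(\rightarrow \mathrm{CO_2}) \\
\mathrm{CaCO_3} &\xrightarrow{\;\sim 592\,^{\circ}\mathrm{C}\;} \mathrm{CaO} + \mathrm{CO_2}\uparrow
\end{align}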
Abstract:
This paper presents a prototype system for tracking people in enclosed indoor environments with a high rate of occlusions. The system uses a stereo camera for acquisition and is capable of disambiguating occlusions using a combination of depth-map analysis, a two-step ellipse-fitting people-detection process, motion models and Kalman filters, and a novel fit metric based on computationally simple object statistics. Testing shows that our fit metric outperforms commonly used position-based and histogram-based metrics, resulting in more accurate tracking of people.
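As an illustration of the motion-model and Kalman-filter component mentioned above, a minimal Python sketch follows (not the paper's actual implementation; the state vector, noise covariances and time step are assumed for illustration):

import numpy as np

# Constant-velocity Kalman filter for a tracked person's image-plane centroid.
# State: [x, y, vx, vy]; all noise parameters are illustrative assumptions.
dt = 1.0                                    # time step (frames)
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)  # motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # only position is observed
Q = np.eye(4) * 0.01                        # process noise
R = np.eye(2) * 1.0                         # measurement noise

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# One predict/update cycle with a detected centroid at (120, 80)
x, P = np.array([118.0, 79.0, 1.0, 0.5]), np.eye(4)
x, P = predict(x, P)
x, P = update(x, P, np.array([120.0, 80.0]))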
Abstract:
Process modeling can be regarded as the currently most popular form of conceptual modeling. Research evidence illustrates how process modeling is applied across the different information system life cycle phases for a range of applications, such as the configuration of Enterprise Systems, workflow management, or software development. However, a detailed discussion of the critical factors determining the quality of process models is still missing. This paper proposes a framework of six quality factors, derived from a comprehensive literature review. It then presents a case study of a utility provider that designed various business process models for the selection of an Enterprise System. The paper summarizes potential means of conducting a successful process modeling initiative and evaluates the described modeling approach within the Guidelines of Modeling (GoM) framework. It closes with an outlook on the lessons learnt and insights into the next phases of this study.
Abstract:
The dynamic interaction between building systems and the external climate is extremely complex, involving a large number of difficult-to-predict variables. In order to study the impact of global warming on the built environment, building simulation techniques must often be used together with forecast weather data. Since all building simulation programs require hourly meteorological input data for thermal comfort and energy evaluation, the provision of suitable weather data becomes critical. Based on a review of existing weather data generation models, this paper presents an effective method to generate approximate future hourly weather data suitable for studying the impact of global warming. Depending on the level of information available about future weather conditions, it is shown that either retaining values at the current level, a constant-offset method, or a diurnal modelling method may be used to generate the future hourly variation of an individual weather parameter. An example of the application of this method to different global warming scenarios in Australia is presented. Since there is no reliable projection of possible changes in air humidity, solar radiation or wind characteristics, as a first approximation these parameters have been assumed to remain at current levels. A sensitivity test of their impact on building energy performance shows that there is generally a good linear relationship between building cooling load and changes in solar radiation, relative humidity or wind speed.
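A minimal Python sketch of the constant-offset and diurnal-modelling ideas described above (the reference data, scenario offset and function names are assumptions for illustration; the paper's actual procedure may differ):

import numpy as np

def constant_offset(hourly_temp, projected_warming):
    """Shift every hour of the reference year by the projected mean change."""
    return np.asarray(hourly_temp, dtype=float) + projected_warming

def diurnal_scaling(hourly_temp, new_mean, new_range):
    """Very simplified diurnal-modelling idea: rescale each day's profile
    to a projected daily mean and diurnal range."""
    t = np.asarray(hourly_temp, dtype=float).reshape(-1, 24)
    mean = t.mean(axis=1, keepdims=True)
    rng = t.max(axis=1, keepdims=True) - t.min(axis=1, keepdims=True)
    return ((t - mean) * (new_range / rng) + new_mean).ravel()

# Example: a scenario assumed to be 2.9 degC warmer on average than today
reference_year = np.random.normal(20.0, 5.0, 8760)   # placeholder hourly data
future_year = constant_offset(reference_year, 2.9)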
Abstract:
Many studies have been carried out in relation to construction procurement methods. Evidence shows that a change of culture and attitude is needed in the construction industry, moving away from traditional adversarial relationships towards cooperative and collaborative relationships. At the same time there is increasing concern and discussion about alternative procurement methods, drifting away from traditional procurement systems. Relational contracting approaches have become more popular in recent years, appearing in common forms such as partnering, alliancing and relationship management contracts. This paper reports the findings of a survey undertaken with a private organisation on an alliance project during its design stage, identifying the critical factors that influence the success of the alliance project. Legal aspects focusing on dispute resolution in alliancing are also highlighted.
Abstract:
Person tracking systems depend on being able to locate a person accurately across a series of frames. Optical flow can be used to segment a moving object from a scene, provided the expected velocity of the moving object is known; but successful detection also relies on being able to segment the background. A problem with existing optical flow techniques is that they do not discriminate the foreground from the background, and so often detect motion (and thus the object) in the background. To overcome this problem, we propose a new optical flow technique, based on an adaptive background segmentation technique, which determines optical flow only in regions of motion. This technique has been developed with a view to being used in surveillance systems, and our testing shows that for this application it is more effective than other standard optical flow techniques.
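A minimal OpenCV sketch of the general idea (adaptive background segmentation gating dense optical flow). This sketch computes flow over the whole frame and then keeps only the vectors inside detected motion regions, which is a simplification of restricting the computation itself; the input file name is assumed:

import cv2

cap = cv2.VideoCapture("corridor.avi")       # assumed input sequence
bg = cv2.createBackgroundSubtractorMOG2()    # adaptive background model

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    fg_mask = bg.apply(frame)                # foreground = regions of motion
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    flow[fg_mask == 0] = 0                   # discard flow in the background
    prev_gray = gray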
How does ‘Newstainment’ actually work?: ethnographic research methods and contemporary popular news
Abstract:
Much debate has taken place recently over the potential for entertainment genres and unorthodox forms of news to provide legitimate – indeed democratized – inroads into the public sphere. Amidst these discussions, however, little attention has been paid to the audiences for programs of this sort, and (even when viewers are considered) the research can too easily treat audiences in homogenous terms and thereby replicate the very dichotomies these television shows directly challenge. This paper is a critical reflection on an audience study of the Australian morning “newstainment” program Sunrise. After examining the show and exploring how it is ‘used’ as a news source, the paper promotes the use of ethnographic study to better conceptualize how citizens integrate and connect the increasingly fragmented and multifarious forms of postmodern political communication available in their everyday lives.
Abstract:
To address the difficulty of propagating and synthesizing information from conceptual to embodiment design, this paper introduces a function-oriented, axiom-based conceptual modeling scheme. Default logic reasoning is exploited for the recognition and reconstitution of conceptual product geometric and topological information. The proposed product modeling system and reasoning approach demonstrate a methodology of "structural variation design", which is verified in the implementation of a GPAL (Green Product All Life-cycle) CAD system. The GPAL system includes major enhancement modules: a mechanism layout sketching method based on fuzzy logic, a knowledge-based function-to-form mapping mechanism, and a conceptual form reconstitution paradigm based on default geometric reasoning. A mechanical hand design example shows a more than 20-fold increase in design efficacy with these enhancement modules in the GPAL system on a general 3D CAD platform.
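As a toy illustration of the default-reasoning step (the rule, facts and wording below are invented for illustration and are not the GPAL rule base): a default rule "prerequisite : justification / conclusion" lets the system assume the conclusion whenever the prerequisite is known and nothing contradicts the justification.

def apply_default(facts, prerequisite, justification, conclusion):
    # If the prerequisite holds and the negation of the justification
    # is not among the known facts, assume the conclusion.
    if prerequisite in facts and ("not " + justification) not in facts:
        return facts | {conclusion}
    return facts

facts = {"feature: cylindrical boss", "mating face declared"}
facts = apply_default(facts,
                      prerequisite="feature: cylindrical boss",
                      justification="axis perpendicular to mating face",
                      conclusion="assume axis perpendicular to mating face")
print(facts)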
Abstract:
Cholesterol-lowering treatment by statins is an important and costly issue; however, its role in stroke has not been well documented. The aim of the present study was to review literature and current practice regarding cholesterol-lowering treatment for stroke patients. A literature review was conducted on lipids in stroke and their management with both statins and diet, including the cost-effectiveness of medical nutrition therapy. Qualifying criteria and prescription procedures of the Pharmaceutical Benefits Scheme (PBS) were also reviewed. Data on lipid levels and statin prescriptions were analysed for 468 patients admitted to a stroke unit. The literature shows that management with both medication and diet can be effective, especially when combined; however, 60% of patients with an ischaemic event had fasting total cholesterol measures ≥4 mmol/L (n = 231), with only 52% prescribed statins on discharge (n = 120). Hypercholesterolaemia is an underdiagnosed and undertreated risk factor within the stroke population. It appears that the PBS has not kept pace with advances in the evidence in terms of statin use in the stroke population, and review is needed. The present review should address the qualifying criteria for the stroke population and recommendations on referral to dietitians for dietary advice. Cholesterol-lowering treatment for both stroke patients and the wider population is an area that needs awareness raising and review by the PBS, medical practitioners and dietitians. The role of dietary and pharmacological treatments needs to be clearly defined, including adjunct therapy, and the cost-effectiveness of medical nutrition therapy realised.
Abstract:
Solo exhibition of sculptural works that use the portrait bust as a vehicle for problematising notions of subjectivity, authority and representation. The exhibition comprised three life-sized figurative busts, each a portrait of the artist, sparsely positioned throughout the gallery space to convey a sense of isolation and abandonment. By emphasising the fragmented nature of the bust format through the removal of all supports (i.e. socle, plinth or alcove), the works sought to address the vulnerability that frames this apparently authoritative Enlightenment portrait format. In so doing, the exhibition aimed to offer, by example, a new way of seeing and interpreting the portrait bust in history. The exhibition was shown at the Institute of Modern Art (Brisbane) and the Perth Institute of Contemporary Arts. Works from the exhibition were included in group shows at the Linden Centre for Contemporary Arts and the Ballarat Fine Art Gallery. Work from the exhibition was purchased for the collection of MONA, Hobart. The exhibition received favourable reviews in Eyeline, Art and Australia and Machine magazines.
Abstract:
Temporal variations caused by pedestrian movement can significantly affect the channel capacity of indoor MIMO-OFDM wireless systems. This paper compares systematic measurements of MIMO-OFDM channel capacity in the presence of pedestrians with capacity values predicted using geometric optics-based ray tracing techniques. Capacity results are presented for a single-room environment at 5.2 GHz with 2x2, 3x3 and 4x4 arrays, as well as for a 2.45 GHz narrowband 8x8 MIMO array. The analysis shows an increase of up to 2 b/s/Hz in instantaneous channel capacity with up to 3 pedestrians. There is an increase of up to 1 b/s/Hz in the average capacity of the 4x4 MIMO-OFDM channel when the number of pedestrians increases from 1 to 3. Additionally, an increase of up to 2.5 b/s/Hz in MIMO-OFDM channel capacity was measured for a 4x4 array compared to a 2x2 array in the presence of pedestrians. The channel capacity values derived from this analysis are important for understanding the limitations and possibilities of MIMO-OFDM systems in indoor populated environments.
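For reference, measured and ray-traced capacities of this kind are normally computed from the standard equal-power MIMO capacity expression (a textbook formula, not specific to this paper), with \rho the mean signal-to-noise ratio, N_t the number of transmit antennas and \mathbf{H} the N_r x N_t channel matrix; for OFDM it is averaged over subcarriers:

C = \log_2 \det\!\left( \mathbf{I}_{N_r} + \frac{\rho}{N_t}\,\mathbf{H}\mathbf{H}^{H} \right) \quad \text{[b/s/Hz]}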
Abstract:
Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be quickly and easily retrieved within seconds. As increasingly more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to find what information is contained in each search engine, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, to solve this problem. Search engine content analysis is a new development of the traditional information retrieval field of collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collections or estimations of their size, and collection descriptions are often represented as term occurrence statistics. An automatic ontology learning method is developed for search engine content analysis, which trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined to find important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general-purpose Internet search engines in use today. Instead of representing collections as a set of terms, as commonly occurs in collection selection, they are represented as a set of subjects, leading to a more robust representation of information and a reduction in the effects of synonymy. The ontology-based method was compared with ReDDE (Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation; the comparison used the standard R-value metric, with encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo, as well as several specialist search engines such as PubMed and that of the U.S. Department of Agriculture. In conclusion, this research shows that the ontology-based method mitigates the need for collection size estimation.
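A minimal sketch of the underlying idea of describing each engine by a distribution over subjects rather than terms, and ranking engines for a query by subject coverage (the engines, subject labels, counts and scoring rule are invented for illustration and are not the thesis's classification rules):

from collections import Counter

engine_subjects = {
    "EngineA": Counter({"medicine": 420, "biology": 310, "chemistry": 150}),
    "EngineB": Counter({"agriculture": 500, "biology": 120, "economics": 90}),
}

def rank_engines(query_subjects, engines):
    """Score each engine by the share of its sampled content
    that falls in the query's predicted subjects."""
    scores = {name: sum(dist[s] for s in query_subjects) / sum(dist.values())
              for name, dist in engines.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_engines({"medicine", "biology"}, engine_subjects))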
Abstract:
The Chaser’s War on Everything is a night-time entertainment program which screened on Australia’s public broadcaster, the ABC, in 2006 and 2007. This enormously successful comedy show managed to generate a lot of controversy in its short lifespan (see, for example, Dennehy, 2007; Dubecki, 2007; McLean, 2007; Wright, 2007), but also drew much praise for its satirising of, and commentary on, topical issues. Through interviews with the program’s producers, qualitative audience research and textual analysis, this paper focuses on the show’s media satire, and the segment ‘What Have We Learned From Current Affairs This Week?’ in particular. Viewed as a form of ‘Critical Intertextuality’ (Gray, 2006), this segment (which offered a humorous critique of the ways in which news and current affairs are presented elsewhere on television) may equip citizens with a better understanding of the news genre’s production methods, thus producing a higher level of public media literacy. This paper argues that through its media satire, The Chaser acts not as a traditional news program would in informing the public with new information, but as a text which can inform and shape our understanding of news that already exists within the public sphere. Humorous analyses and critiques of the media, like those analysed in this paper, are in fact very important forms of infotainment, because they can provide “other, ‘improper,’ and yet more media literate and savvy interpretations” (Gray, 2006, p. 4) of the news.
Abstract:
This paper examines the vibration characteristics and vibration control of complex ship structures. It is shown that the input mobilities of a ship structure at engine supports, due to out-of-plane force or bending moment excitations, are governed by the flexural stiffness of the engine supports. The frequency-averaged input mobilities of the ship structure, due to such excitations, can be represented by those of the corresponding infinite beam. The torsional moment input mobility at the engine support can be estimated from the torsional response of the engine bed section under direct excitation. It is found that the inclusion of ship hull and deck plates in the ship structure model has little effect on the frequency-averaged response of the ship structure. This study also shows that vibration propagation in complex ship structures at low frequencies can be attenuated by imposing irregularities on the ring frame locations in ships. Vibration responses of ship structures due to machinery excitations at higher frequencies can be controlled by structural modifications of the local supporting structures, such as engine beds in ships.
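The comparison with "the corresponding infinite beam" usually refers to the driving-point mobility of an infinite uniform beam under a transverse point force, for which the standard textbook expression (not specific to this paper) is

Y_{\infty} = \frac{1 - \mathrm{j}}{4\, m' c_B}, \qquad
c_B = \left( \frac{\omega^{2} E I}{m'} \right)^{1/4},

where m' is the mass per unit length and EI the bending stiffness of the engine support.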
Abstract:
With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Because a large number of Web services are available, finding an appropriate Web service that matches a user's requirements is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery in matching the best service. The process of Web service discovery often results in many individual services that only partially fulfil the user's interest. Considering the semantic relationships of the words used in describing the services, as well as the input and output parameters, can lead to more accurate Web service discovery, and appropriate linking of individually matched services should then fully satisfy the user's requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis on the content of Web service description language documents, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging on a large quantity of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms which could not otherwise be found. Sometimes a single Web service is unable to fully satisfy the user's requirement; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In this link analysis phase, the Web services are modelled as nodes of a graph, and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum traversal cost. The third phase, system integration, integrates the results from the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations of individual and composite Web services to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also show that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion. Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
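A minimal Python sketch of the link-analysis idea described above (services as graph nodes, edge weights as assumed composition costs, and Floyd-Warshall as the all-pairs shortest-path algorithm; the service names and costs are invented for illustration):

INF = float("inf")
services = ["GeoCode", "WeatherByCoord", "AlertSMS"]
n = len(services)
cost = [[INF] * n for _ in range(n)]
for i in range(n):
    cost[i][i] = 0.0
cost[0][1] = 1.0          # GeoCode output feeds WeatherByCoord input
cost[1][2] = 2.0          # WeatherByCoord output feeds AlertSMS input

# Floyd-Warshall: cheapest composition path between every pair of services
for k in range(n):
    for i in range(n):
        for j in range(n):
            if cost[i][k] + cost[k][j] < cost[i][j]:
                cost[i][j] = cost[i][k] + cost[k][j]

print("Cheapest GeoCode -> AlertSMS composition cost:", cost[0][2])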