20 results for Revolutionary Tendency of Peronism

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

'Revolutionary Self Portrait' (2009) is a sculptural self-portrait. The work comprises a bust engulfed in hair and a scalloped pedestal stand. Historically, divine energy has frequently been depicted using fluid forms - drapery, clouds and occasionally hair. These forms and associations act as a departure point for this sculpture, in which the figure is depicted in a state of inundation by billowing tufts of hair. The work was also inspired by the tendency of great 19th century utopian thinkers - for example Marx, Bakunin and Kropotkin - to wear large beards. Within both traditions, the language of heroic subjectivity is amplified by a sculptural extension of the body. In 'Revolutionary Self Portrait', however, this extension threatens to suffocate the subject - a gesture made all the more ironic by the fact that the artist himself is incapable of growing a beard. The work was selected for the National Artists' Self Portrait Prize, University of Queensland Art Museum, 2009.

Relevance: 100.00%

Abstract:

This two-part paper considers a range of magico-religious experiences (such as visions and voices) and spirit beliefs in a rural Aboriginal town. The papers challenge the tendency of institutionalised psychiatry to medicalise these experiences and critique the way in which its individualistic practice is intensified in the face of an incomprehensible Aboriginal 'other' to become part of the power imbalance that characterises the relationship between Indigenous and white domains. The work reveals the internal differentiation and politics of the Aboriginal domain: as the meanings of these experiences and actions are contested and negotiated by the residents, they decentre the concerns of the white domain and attempt to control their relationship with it. Thus the plausibility structure that sustains these multiple realities reflects both accommodation and resistance to the material and historical conditions imposed and enacted by mainstream society on the residents, and to current socio-political realities. I conclude that the residents' narratives chart the grounds of moral adjudication, as the experiences were rarely conceptualised by local people as signs of individual pathology but as reflections of social reality. Psychiatric drug therapy and the behaviourist assumptions underlying its practice posit atomised individuals as the appropriate site of intervention, as against the multiple realities revealed by the phenomenology of the experiences. The papers thus call into question the Australian mainstream 'commonsense' that circulates about Aboriginal and Torres Strait Islander people and justifies representations of them as sickly outcasts in Australian society.

Relevance: 100.00%

Abstract:

Typical reference year (TRY) weather data is often used to represent the long-term weather pattern for building simulation and design. Through analysis of ten years of historical hourly weather data for seven major Australian capital cities, using the frequencies procedure of descriptive statistics (in SPSS), this paper investigates:
• how closely the typical reference year (TRY) weather data represents the long-term weather pattern;
• the variations and common features that may exist between relatively hot and cold years.
It is found that, for the given set of input data, the discrepancy between the TRY and multiple years is much smaller for dry bulb temperature, relative humidity and global solar irradiance than for the other weather elements. The overall distribution patterns of key weather elements are also generally similar between the hot and cold years, but with some shift and/or small distortion. There is little common tendency of change between the hot and the cold years across the different weather variables and study locations.
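
The frequency comparison described in this abstract can be sketched in a few lines. This is not the authors' SPSS procedure; it is a minimal Python/pandas analogue under assumed inputs, where the file names (try_hourly.csv, multi_year_hourly.csv) and the dry_bulb_temp column are hypothetical placeholders.

```python
# Minimal sketch (assumed inputs): compare the relative frequency distribution of
# hourly dry bulb temperature in a typical reference year (TRY) against a
# multi-year hourly record. File and column names are hypothetical placeholders.
import pandas as pd

BINS = list(range(-5, 46, 5))  # 5 degC bins spanning a typical Australian range

def binned_frequencies(path: str, column: str = "dry_bulb_temp") -> pd.Series:
    """Relative frequency (%) of hourly values falling in each temperature bin."""
    values = pd.read_csv(path)[column].dropna()
    counts = pd.cut(values, bins=BINS).value_counts(sort=False)
    return 100.0 * counts / counts.sum()

try_freq = binned_frequencies("try_hourly.csv")           # one typical reference year
multi_freq = binned_frequencies("multi_year_hourly.csv")  # e.g. ten years of data

# Small per-bin discrepancies suggest the TRY represents the long-term pattern well
# for this variable; the same comparison can be repeated for humidity, irradiance, etc.
comparison = pd.DataFrame({"TRY %": try_freq, "multi-year %": multi_freq,
                           "abs diff": (try_freq - multi_freq).abs()})
print(comparison)
```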

Relevance: 100.00%

Abstract:

The water sensitivity of authigenic smectite- and illite-rich illite/smectites in sandstone reservoirs has been investigated using an Environmental Scanning Electron Microscope (ESEM). The ESEM enabled the illite/smectites to be directly observed in situ at high magnification during freshwater immersion, and was also particularly effective in allowing the same selected illite/smectite areas to be closely compared before and after freshwater treatments. The tendency of authigenic smectite-rich illite/smectite to swell on contact with fresh water varies greatly. Smectite-rich illite/smectite may osmotically swell to many times its original volume to form a gel which greatly reduces porosity and permeability, or may undergo only a subtle morphological change which has little or no adverse effect on reservoir quality. Authigenic illite-rich illite/smectite in sandstones does not swell when immersed in fresh water. Even after prolonged soaking in fresh water, illite-rich illite/smectite particles retain their original morphology. Accordingly, illite-rich illite/smectite in sandstones is unlikely to cause formation damage if exposed to freshwater-based fluids.

Relevance: 100.00%

Abstract:

Nanowires (NWs) have attracted broad attention and application owing to their remarkable mechanical, optical, electrical, thermal and other properties. To unlock the revolutionary characteristics of NWs, a considerable body of experimental and theoretical work has been conducted. However, because of the extremely small dimensions of NWs, in situ experiments involve inherent complexities and significant challenges in application and manipulation. For the same reason, the presence of defects is one of the most dominant factors in determining their properties. Hence, given the limitations of experiments and the need to investigate the influence of different defects, numerical simulation and modelling become increasingly important for characterising the properties of NWs. Despite the number of numerical studies of NWs, significant work still lies ahead in terms of problem formulation, interpretation of results, identification and delineation of deformation mechanisms, and constitutive characterisation of behaviour. Therefore, the primary aim of this study was to characterise both perfect and defected metal NWs. Large-scale molecular dynamics (MD) simulations were used to assess the mechanical properties and deformation mechanisms of different NWs under diverse loading conditions, including tension, compression, bending, vibration and torsion. The target samples included different FCC metal NWs (e.g., Cu, Ag and Au NWs), either in a perfect crystal structure or constructed with different defects (e.g., pre-existing surface/internal defects, grain/twin boundaries). In tensile deformation, Young's modulus was found to be insensitive to the different styles of pre-existing defects, whereas the yield strength showed considerable reduction. The deformation mechanisms were greatly influenced by the presence of defects: the different defects acted as dislocation sources and triggered a rich variety of deformation mechanisms. Similar conclusions were obtained from compressive deformation, i.e., Young's modulus was insensitive to the different defects, but the critical stress was evidently reduced. Results from the bending deformation revealed that current modified beam models accounting for the surface effect, or for both the surface effect and the axial extension effect, still suffer a degree of inaccuracy, especially for NWs with ultra-small cross-sectional size. Additionally, the flexural rigidity of the NW was found to be insensitive to the different pre-existing defects, while the yield strength showed an evident decrease. In the resonance study, the first-order natural frequency of the NW with pre-existing surface defects was almost the same as that of the perfect NW, whereas a lower first-order natural frequency and a significantly degraded quality factor were observed for NWs with grain boundaries. Most importantly, <110> FCC NWs were found to exhibit a novel beat phenomenon driven by a single actuation, which resulted from the asymmetry of the lattice spacing in the (110) plane of the NW cross-section and is expected to have crucial implications for in situ nanomechanical measurements. In particular, <110> Ag NWs with rhombic, truncated rhombic and triangular cross-sections were found to naturally possess two first-mode natural frequencies, which are envisioned to enable NEMS applications that operate in a non-planar regime.
The torsion results revealed that the torsional rigidity of the NW was insensitive to the presence of pre-existing defects and twin boundaries, but was evidently reduced by grain boundaries. Meanwhile, the critical angle decreased considerably for defected NWs. This study has provided a comprehensive and in-depth investigation of the mechanical properties and deformation mechanisms of perfect and defected NWs, which will extend and enhance the existing knowledge and understanding of the properties and performance of NWs and ultimately benefit the realisation of their full potential applications. All of the MD models and theoretical analysis techniques established for the target NWs in this research are also applicable to future studies of other kinds of NWs. The work suggests that MD simulation is an effective tool, not only for characterising the properties of NWs but also for predicting novel or unexpected properties.
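
The thesis itself relies on large-scale MD with realistic interatomic potentials, which cannot be reproduced here. Purely as an illustrative stand-in for the idea of extracting a stiffness from a simulated tensile test, the sketch below quasi-statically stretches a one-dimensional Lennard-Jones chain; all parameters are reduced-unit placeholders and this is not the thesis' simulation setup.

```python
# Toy stand-in (not the thesis' MD setup): quasi-static stretching of a 1-D
# Lennard-Jones chain, used only to illustrate reading a stiffness off a
# simulated tensile test. All quantities are in reduced (dimensionless) units.
import numpy as np

EPS, SIGMA = 1.0, 1.0                # LJ well depth and length scale (assumed)
R0 = 2.0 ** (1.0 / 6.0) * SIGMA      # equilibrium nearest-neighbour spacing
N_BONDS = 100                        # bonds in the chain (assumed)

def lj_energy(r: np.ndarray) -> np.ndarray:
    """Lennard-Jones pair energy at separation r."""
    sr6 = (SIGMA / r) ** 6
    return 4.0 * EPS * (sr6 ** 2 - sr6)

strains = np.linspace(0.0, 0.10, 101)
# Affine stretching: every bond is elongated to R0 * (1 + strain).
energies = N_BONDS * lj_energy(R0 * (1.0 + strains))

# Force on the chain = dE/dL, with L the total chain length.
forces = np.gradient(energies, strains * R0 * N_BONDS)
# Slope of the force-strain curve near zero strain: a crude stiffness analogue.
stiffness = np.gradient(forces, strains)[0]
print(f"small-strain stiffness (reduced units): {stiffness:.2f}")
```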

Relevance: 100.00%

Abstract:

In 2012, Queensland University of Technology (QUT) committed to the massive project of revitalizing its Bachelor of Science (ST01) degree. Like most universities in Australia, QUT has begun work to align all courses by 2015 to the requirements of the updated Australian Qualifications Framework (AQF), which is regulated by the Tertiary Education Quality and Standards Agency (TEQSA). From the very start of the redesigned degree program, students approach scientific study with an exciting mix of theory and highly topical real-world examples through their chosen "grand challenge." These challenges, Fukushima and nuclear energy for example, are the lenses used to explore science and lead to 21st century learning outcomes for students. For the teaching and learning support staff, our grand challenge is to expose all science students to multidisciplinary content with a strong emphasis on embedding information literacies into the curriculum. With ST01, QUT is taking the initiative to rethink not only content but how units are delivered and even how we work together between the faculty, the library and learning and teaching support. This was the desired outcome, but as we move from design to implementation, has this goal been achieved? A main component of the new degree is to ensure scaffolding of information literacy skills throughout the entirety of the three-year course. However, with the strong focus on problem-based learning and group work skills, many issues arise both for students and lecturers. A move away from a traditional lecture style is necessary but impacts on academics' workload and comfort levels. Therefore, academics in collaboration with librarians and other learning support staff must draw on each other's expertise to work together to ensure pedagogy, assessments and targeted classroom activities are mapped within and between units. This partnership can counteract the tendency of isolated, unsupported academics to concentrate on day-to-day teaching at the expense of consistency between units and big-picture objectives. Support staff may have a more holistic view of a course or degree than coordinators of individual units, making communication and truly collaborative planning even more critical. As well, due to staffing time pressures, design and delivery of new curriculum is generally done quickly, with no option for the designers to stop and reflect on the experience and outcomes. It is vital we take this unique opportunity to closely examine what QUT has and hasn't achieved, to be able to recommend a better way forward. This presentation will discuss these important issues and stumbling blocks to provide a set of best-practice guidelines for QUT and other institutions. The aim is to help improve collaboration within the university, as well as to maximize students' ability to put information literacy skills into action. As our students embark on their own grand challenges, we must challenge ourselves to honestly assess our own work.

Relevance: 100.00%

Abstract:

Characteristics of the electrical breakdown of a planar magnetron enhanced with an electromagnet and a hollow-cathode structure are studied experimentally and numerically. At lower pressures the breakdown voltage depends on the applied magnetic field, and the voltage necessary to achieve the self-sustained discharge regime can be significantly reduced. At higher pressures the dependence is less sensitive to the magnetic field magnitude and shows a tendency toward increased breakdown voltage at stronger magnetic fields. A model of the magnetron discharge breakdown is developed with the background gas pressure and the magnetic field as parameters. The model describes the motion of electrons, which gain energy from the electric field while drifting across the magnetic field and undergo collisions with neutrals, thus generating new bulk electrons. These electrons are in turn accelerated in the electric field and effectively ionise enough neutrals to enable the self-sustained discharge regime. The model is based on the assumption of combined classical and near-wall mechanisms of electron conductivity across the magnetic field, and is consistent with the experimental results. The obtained results represent a significant advance toward energy-efficient multipurpose magnetron discharges.
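
The paper's magnetised breakdown model is not reproduced here. As a rough, unmagnetised point of reference only, the sketch below evaluates the classical Townsend/Paschen breakdown criterion; the constants A, B and the secondary-emission coefficient gamma are generic assumed values, not fitted to the reported experiments.

```python
# Classical (unmagnetised) Townsend/Paschen breakdown criterion, for orientation only.
# The constants below are generic assumed values for an argon-like gas.
import numpy as np

A = 11.0      # saturation ionisation constant, 1/(cm*Torr)        (assumed)
B = 180.0     # effective ionisation energy constant, V/(cm*Torr)  (assumed)
GAMMA = 0.05  # secondary-electron emission coefficient            (assumed)

def paschen_voltage(pd_product: np.ndarray) -> np.ndarray:
    """Breakdown voltage V_b as a function of the pressure-gap product p*d (Torr*cm)."""
    return B * pd_product / (np.log(A * pd_product) - np.log(np.log(1.0 + 1.0 / GAMMA)))

pd_values = np.logspace(-0.5, 2, 200)   # stay to the right of the curve's divergence
v_b = paschen_voltage(pd_values)
i_min = v_b.argmin()
print(f"Paschen minimum ~ {v_b[i_min]:.0f} V at p*d ~ {pd_values[i_min]:.2f} Torr*cm")
```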

Relevance: 100.00%

Abstract:

Over the past 20 years the labour market, workforce and work organisation of most if not all industrialised countries have been significantly refashioned by the increased use of more flexible work arrangements, variously labelled as precarious employment or contingent work. There is now a substantial and growing body of international evidence that many of these arrangements are associated with a significant deterioration in occupational health and safety (OHS), using a range of measures such as injury rates, disease, hazard exposures and work-related stress. Moreover, there is an emerging body of evidence that these arrangements pose particular problems for conventional regulatory regimes. Recognition of these problems has aroused the concern of policy makers - especially in Europe, North America and Australia - and a number of responses have been adopted in terms of modifying legislation, producing new guidance material and codes of practice and revised enforcement practices. This article describes one such initiative in Australia with regard to home-based clothing workers. The regulatory strategy developed in one Australian jurisdiction (and now being ‘exported’ into others) seeks to counter this process via contractual tracking mechanisms to follow the work, tie in liability and shift overarching legal responsibility to the top of the supply chain. The process also entails the integration of minimum standards relating to wages, hours and working conditions; OHS and access to workers’ compensation. While home-based clothing manufacture represents a very old type of ‘flexible’ work arrangement, it is one that regulators have found especially difficult to address. Further, the elaborate multi-tiered subcontracting and diffuse work locations found in this industry are also characteristic of newer forms of contingent work in other industries (such as some telework) and the regulatory challenges they pose (such as the tendency of elaborate supply chains to attenuate and fracture statutory responsibilities, at least in terms of the attitudes and behaviour of those involved).

Relevance: 100.00%

Abstract:

Background: In 2011, Australia published a set of 6 population-level indicators assessing breastfeeding, formula use, and the introduction of soft/semisolid/solid foods. Objectives: This study aimed to report the feeding practices of Australian infants against these indicators and determine the predictors of early breastfeeding cessation and introduction of solids. Methods: Mother–infant dyads (N = 1470) were recruited postnatally in 2 Australian capital cities and regional areas of 1 state between February 2008 and March 2009. Demographic and feeding intention data were collected by self-completed questionnaire at infant birth, with feeding practices (current feeding mode, age of breastfeeding cessation, age of formula and/or solids introduction) reported when the infant was between 4 and 7 months of age and again at around 13 months of age. Multiple logistic regression was used to determine the predictors of breastfeeding cessation and solids introduction. Results: Although initiation of breastfeeding was almost universal (93.3%), less than half of the infants were breastfed to 6 months (41.7%) and 33.3% were receiving solids by 4 months. Women who were socially disadvantaged, younger, less educated, unpartnered, primiparous, and/or overweight were most likely to have ceased breastfeeding before 6 months of age, and younger and/or less educated women were most likely to have introduced solid food by 4 months of age. Not producing adequate milk was the most common reason provided for cessation of breastfeeding. Conclusion: The feeding behaviors of Australian infants in the first 12 months fall well short of recommendations. Women need anticipatory guidance as to the indicators of breastfeeding success, and the tendency of women to doubt the adequacy of their breast milk supply warrants further investigation.
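
A multiple logistic regression of the kind reported here can be sketched as follows. The data file and column names (dyads.csv, ceased_before_6mo, maternal_age, and so on) are hypothetical placeholders, not the study's data; the predictors simply mirror those listed in the abstract.

```python
# Sketch of a multiple logistic regression for early breastfeeding cessation.
# dyads.csv and all column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dyads.csv")  # one row per mother-infant dyad (placeholder file)

# Outcome: breastfeeding ceased before 6 months (1) or not (0); predictors mirror
# those flagged in the abstract (maternal age, education, partner status, parity, BMI).
model = smf.logit(
    "ceased_before_6mo ~ maternal_age + education_years + C(partnered) + C(primiparous) + bmi",
    data=df,
).fit()

print(model.summary())                 # coefficients on the log-odds scale
print(np.exp(model.params).round(2))   # odds ratios for easier interpretation
```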

Relevance: 100.00%

Abstract:

Marketers and commercial media alike are confronted by shifts in the social relations of media production and consumption in the global services economy, including the challenge of capturing, managing and commercialising media-user productivity. This trajectory of change in media cultures and economies is described here as ‘mass conversation’. Two media texts and a new media object provide a starting point for charting the ascendance and social impact of mass conversation. Apple’s 1984 television commercial, which launched the Macintosh computer, inverted George Orwell’s dystopian vision of the social consequences of panoptic communications systems. It invoked a revolutionary rhetoric to anticipate the social consequences of a new type of interactivity since theorised as ‘intercreativity’. This television commercial is contrasted with another used in Nike’s 2006 launch of its Nike+ (Apple iPod) system. The Nike+ online brand community is also used to consider how a multiplatform brand channel is seeking to manage the changing norms and practices of consumption and end-user agency. This analysis shows that intercreativity modifies the operations of ‘Big Brother’ but serves the more mundane than revolutionary purpose of generating commercial value from the affective labour of end-users.

Relevance: 100.00%

Abstract:

Modern Engineering Asset Management (EAM) requires accurate assessment of current asset health and prediction of future asset health condition. Appropriate mathematical models that are capable of estimating times to failure and the probability of failure in the future are essential in EAM. In most real-life situations, the lifetime of an engineering asset is influenced and/or indicated by different factors, termed covariates. Hazard prediction with covariates is an elemental notion in reliability theory: it estimates the tendency of an engineering asset to fail instantaneously beyond the current time, given that it has survived up to the current time. A number of statistical covariate-based hazard models have been developed; however, none of them explicitly incorporates both external and internal covariates into one model. This paper introduces a novel covariate-based hazard model to address this concern, named the Explicit Hazard Model (EHM). Both the semi-parametric and non-parametric forms of this model are presented. The major purpose of this paper is to illustrate the theoretical development of EHM; due to page limitations, a case study with reliability field data is presented in the applications part of this study.
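
The Explicit Hazard Model itself is not specified in this abstract, so it is not reproduced here. As a familiar point of reference for covariate-based hazard modelling, the sketch below fits a standard Cox proportional-hazards model with the lifelines library; the data file and column names are hypothetical.

```python
# Standard Cox proportional-hazards fit as a reference point for covariate-based
# hazard modelling (this is not the paper's EHM). File and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

# Expected layout: one row per asset, with lifetime, censoring indicator and covariates,
# e.g. an external covariate (operating_temperature) and an internal one (vibration_level).
df = pd.read_csv("asset_lifetimes.csv")

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_failure", event_col="failed")
cph.print_summary()  # hazard ratios quantify how each covariate scales the baseline hazard
```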

Relevance: 100.00%

Abstract:

A pervasive and puzzling feature of banks' Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks' proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
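
The diversification-effect comparison at the heart of this paper can be illustrated with simulated profit-and-loss series. The risk categories, covariance matrix and confidence level below are assumptions for illustration, not the banks' reported figures.

```python
# Illustration of the diversification effect using simulated daily P&L per risk category.
# The covariance matrix, confidence level and scale are assumptions, not bank data.
import numpy as np

rng = np.random.default_rng(0)
ALPHA = 0.01  # 99% VaR

# Hypothetical correlated P&L (in $m) for three broad risk categories.
cov = np.array([[1.0, 0.3, 0.1],
                [0.3, 1.5, 0.2],
                [0.1, 0.2, 0.8]])
pnl = rng.multivariate_normal(mean=[0.0, 0.0, 0.0], cov=cov, size=10_000)

def var(losses: np.ndarray) -> float:
    """One-sided VaR at level ALPHA, reported as a positive number."""
    return -np.quantile(losses, ALPHA)

individual_vars = np.array([var(pnl[:, i]) for i in range(pnl.shape[1])])
aggregate_var = var(pnl.sum(axis=1))

# Diversification effect: the gap between the sum of stand-alone VaRs and the
# aggregate VaR. Understating this gap inflates the reported aggregate VaR.
print(f"sum of individual VaRs: {individual_vars.sum():.2f}")
print(f"aggregate VaR:          {aggregate_var:.2f}")
print(f"diversification effect: {individual_vars.sum() - aggregate_var:.2f}")
```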

Relevance: 100.00%

Abstract:

This paper presents a solution to the problem of estimating the monotonic trend of a slowly varying oscillating system. A recursive Prony Analysis (PA) scheme is developed that obtains a dynamic model whose parameters are identified with the forgetting-factor recursive least squares (FFRLS) method. A box threshold principle is proposed to separate the dominant components, resulting in an accurate estimation of the trend of oscillating systems. Performance of the proposed PA is evaluated using real-time measurements in the presence of random noise and vibration effects. Moreover, the proposed method is used to estimate the monotonic trend of deck displacement to assist in the safe landing of an unmanned aerial vehicle (UAV). It is shown that the proposed method can estimate the instantaneous mean deck displacement satisfactorily, making it well suited for integration into ship-UAV approach and landing guidance systems.
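
A minimal sketch of the FFRLS parameter update that underpins the recursive Prony analysis is given below. The AR model order, forgetting factor and test signal are illustrative assumptions; the box threshold step and the full Prony mode extraction are not reproduced.

```python
# Forgetting-factor recursive least squares (FFRLS) tracking the coefficients of a
# simple AR model; order, forgetting factor and test signal are illustrative.
import numpy as np

def ffrls(y: np.ndarray, order: int = 2, lam: float = 0.98) -> np.ndarray:
    """Recursively estimate AR(order) coefficients of y with forgetting factor lam."""
    theta = np.zeros(order)            # parameter estimate
    P = 1e4 * np.eye(order)            # covariance, large initial uncertainty
    history = []
    for k in range(order, len(y)):
        phi = y[k - order:k][::-1]     # regressor: most recent samples first
        err = y[k] - phi @ theta       # one-step-ahead prediction error
        gain = P @ phi / (lam + phi @ P @ phi)
        theta = theta + gain * err
        P = (P - np.outer(gain, phi) @ P) / lam
        history.append(theta.copy())
    return np.array(history)

# Example: a decaying oscillation riding on a slow trend, loosely mimicking deck motion.
t = np.arange(0.0, 20.0, 0.02)
y = 0.1 * t + np.exp(-0.1 * t) * np.sin(2 * np.pi * 0.8 * t)
coeffs = ffrls(y)
print(coeffs[-1])  # latest AR coefficients; Prony analysis maps such models to modes
```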

Relevance: 100.00%

Abstract:

Runt related transcription factor 2 (RUNX2) is a key regulator of osteoblast differentiation. Several variations within RUNX2 have been found to be associated with significant changes in BMD, which is a major risk factor for fracture. In this study we report that an 18bp deletion within the polyalanine tract (17A>11A) of RUNX2 is significantly associated with fracture. Carriers of the 11A allele were found to be nearly twice as likely to have sustained fracture. Within the fracture category, there was a significant tendency of 11A carriers to present with fractures of bones of intramembranous origin compared to bones of endochondral origin (p=0.005). In a population of random subjects, the 11A allele was associated with decreased levels of serum collagen cross links (CTx, p=0.01), suggesting decreased bone turnover. The transactivation function of the 11A allele was quantitatively decreased. Interestingly, we found no effect of the 11A allele on BMD at multiple skeletal sites, although these were not the sites where a relationship with fracture was most evident. These findings suggest that the 11A allele is a biologically relevant polymorphism that influences serum CTx and confers enhanced fracture risk in a site-selective manner related to intramembranous bone ossification.

Relevance: 100.00%

Abstract:

This paper analyses the probabilistic linear discriminant analysis (PLDA) speaker verification approach with limited development data. It investigates the use of the median as the central tendency of a speaker's i-vector representation, and the effectiveness of weighted discriminative techniques, on the performance of state-of-the-art length-normalised Gaussian PLDA (GPLDA) speaker verification systems. The analysis shows that the median (using a median Fisher discriminator (MFD)) provides a better representation of a speaker when the number of representative i-vectors available during development is reduced, and that the pair-wise weighting approach in weighted LDA and weighted MFD provides additional improvement in limited development conditions. The best performance is obtained using a weighted MFD approach, which shows over 10% improvement in EER over the baseline GPLDA system on mismatched and interview-interview conditions.
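
The central-tendency choice discussed in this abstract (median versus mean of a speaker's development i-vectors) can be sketched directly. The dimensions and toy data below are assumptions for illustration; the weighted MFD/LDA projection and the GPLDA back-end are not reproduced.

```python
# Component-wise median versus mean as a speaker's central tendency over a small
# set of development i-vectors. Dimensions and data are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)
IVEC_DIM = 400       # typical i-vector dimensionality (assumed)
IVECS_PER_SPK = 5    # limited development data

dev_ivectors = {f"spk{n}": rng.standard_normal((IVECS_PER_SPK, IVEC_DIM))
                for n in range(10)}

def speaker_representation(ivecs: np.ndarray, use_median: bool = True) -> np.ndarray:
    """Component-wise median (more robust with few, possibly outlying i-vectors) or mean."""
    return np.median(ivecs, axis=0) if use_median else ivecs.mean(axis=0)

median_reps = {s: speaker_representation(v) for s, v in dev_ivectors.items()}
mean_reps = {s: speaker_representation(v, use_median=False) for s, v in dev_ivectors.items()}

# A weighted MFD/LDA projection and the GPLDA back-end would then operate on these
# per-speaker representations; only the central-tendency step is sketched here.
print(np.linalg.norm(median_reps["spk0"] - mean_reps["spk0"]))
```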