132 results for CLASSIC ARTICLES


Relevance:

20.00%

Publisher:

Abstract:

On July 25, 2014, Justice David Davies sentenced Jonathan Moylan at the Supreme Court of New South Wales for a breach of section 1041E(1) of the Corporations Act 2001 (Cth). The ruling is a careful and deliberate decision, showing equipoise; Justice Davies has a reputation as a thoughtful and philosophical adjudicator. The judge convicted Moylan and sentenced him to imprisonment for 1 year and 8 months, and ordered that Moylan be “immediately released upon giving security by way of recognisance in the sum of $1000 to be of good behaviour for a period of 2 years commencing today”.

Relevance:

20.00%

Publisher:

Abstract:

Classic identity negative priming (NP) refers to the finding that when an object is ignored, subsequent naming responses to it are slower than when it has not been previously ignored (Tipper, S.P., 1985. The negative priming effect: inhibitory priming by ignored objects. Q. J. Exp. Psychol. 37A, 571-590). It is unclear whether this phenomenon arises due to the involvement of abstract semantic representations that the ignored object accesses automatically. Contemporary connectionist models propose a key role for the anterior temporal cortex in the representation of abstract semantic knowledge (e.g., McClelland, J.L., Rogers, T.T., 2003. The parallel distributed processing approach to semantic cognition. Nat. Rev. Neurosci. 4, 310-322), suggesting that this region should be involved during performance of the classic identity NP task if it involves semantic access. Using high-field (4 T) event-related functional magnetic resonance imaging, we observed increased BOLD responses in the left anterolateral temporal cortex, including the temporal pole, that were directly related to the magnitude of each individual's NP effect, supporting a semantic locus. Additional signal increases were observed in the supplementary eye fields (SEF) and left inferior parietal lobule (IPL).
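Purely as an illustration (not part of the study), the brain-behaviour relationship described above, relating each participant's behavioural NP effect to the signal change in a temporal-lobe region of interest, can be sketched as follows; the arrays are hypothetical values, assuming the NP effect is computed as the naming-latency difference between ignored-repetition and control trials:

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant values (illustrative only): naming
# latencies (ms) for ignored-repetition and control trials, and a
# percent-signal-change estimate for a left anterolateral temporal ROI.
rt_ignored_repetition = np.array([642.0, 655.0, 630.0, 661.0, 648.0])
rt_control = np.array([620.0, 628.0, 615.0, 630.0, 626.0])
roi_bold_psc = np.array([0.21, 0.34, 0.12, 0.41, 0.27])

# Classic identity NP effect: slowing of naming when the probe target
# was the previously ignored object, relative to control trials.
np_effect = rt_ignored_repetition - rt_control

# Brain-behaviour relationship: does a larger NP effect go with a
# larger ROI response?
r, p = stats.pearsonr(np_effect, roi_bold_psc)
print("NP effects (ms):", np_effect)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```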

Relevance:

20.00%

Publisher:

Abstract:

We thank Ploski and colleagues for their interest in our study. The explanation for the difference in our findings is a typographic error in Table 2 of our article, whereby the alleles for marker TNF −1031 were labeled incorrectly...

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a refined classic noise prediction method based on VISSIM and the FHWA noise prediction model is formulated to analyze the sound level contributed by traffic on the Nanjing Lukou airport connecting freeway before and after widening. The aims of this research are to (i) assess the traffic noise impact on the Nanjing University of Aeronautics and Astronautics (NUAA) campus before and after freeway widening, (ii) compare the prediction results with field data to test the accuracy of the method, and (iii) analyze the relationship between traffic characteristics and sound level. The results indicate that the mean difference between model predictions and field measurements is acceptable. The traffic composition impact study indicates that buses (including mid-sized trucks) and heavy goods vehicles contribute a significant proportion of total noise power despite their low traffic volume. In addition, speed analysis offers an explanation for the minor differences in noise level across time periods. Future work will aim at reducing model error by focusing on noise barrier analysis using the FEM/BEM method and by modifying the vehicle noise emission equation through field experimentation.
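As an illustration only (not the paper's actual model), the observation that low-volume bus and heavy-vehicle traffic can still dominate the noise energy follows from the standard decibel energy summation, L_total = 10 log10(sum of 10^(L_i/10)), used by FHWA-style prediction methods; the per-class levels below are made-up numbers:

```python
import math

def combined_level(class_levels_db):
    """Energy-based combination of per-class sound levels:
    L_total = 10 * log10(sum(10 ** (L_i / 10)))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in class_levels_db))

# Illustrative (made-up) per-class hourly contributions at a receiver, in dB(A):
# cars are numerous but individually quieter; buses/mid-sized trucks and
# heavy goods vehicles are few but individually much louder.
contributions = {
    "cars": 63.0,
    "buses and mid-sized trucks": 60.0,
    "heavy goods vehicles": 61.0,
}

l_total = combined_level(contributions.values())
for vehicle_class, level in contributions.items():
    energy_share = 100 * 10 ** (level / 10) / 10 ** (l_total / 10)
    print(f"{vehicle_class}: {level:.1f} dB(A) -> {energy_share:.0f}% of total sound energy")
print(f"combined level: {l_total:.1f} dB(A)")
```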

Relevance:

10.00%

Publisher:

Abstract:

This paper considers how the Internet can be used to leverage commercial sponsorships to enhance audience attitudes toward the sponsor. Definitions are offered that distinguish the terms leverage and activation with respect to sponsorship-linked marketing; leveraging encompasses all marketing communications collateral to the sponsorship investment, whereas activation relates to those communications that encourage interaction with the sponsor. Although activation in many instances may be limited to the immediate event-based audience, leveraging sponsorships via sponsors' Web sites enables activation at the mass-media audience level. Results of a Web site navigation experiment demonstrate that activational sponsor Web sites promote more favorable attitudes than do nonactivational Web sites. It is also shown that sponsor-sponsee congruence effects generalize to the online environment, and that the effects of sponsorship articulation on audience attitudes are moderated by the commerciality of the explanation for the sponsor-sponsee relationship. Importantly, the study reveals that attitudinal effects associated with variations in leveraging, congruence, and orientation of articulation may be sustained across time.

Relevance:

10.00%

Publisher:

Abstract:

Designers and artists have integrated recent advances in interactive, tangible and ubiquitous computing technologies to create new forms of interactive environments in the domains of work, recreation, culture and leisure. Many designs of technology systems begin with the workplace in mind, and with function, ease of use, and efficiency high on the list of priorities. [1] These priorities do not fit well with works designed for an interactive art environment, where the aims are many, and where utility and functionality serve only to support a playful, ambiguous or even experimental experience for the participants. To evaluate such works requires an integration of art-criticism techniques with more recent Human-Computer Interaction (HCI) methods, and an understanding of the different nature of engagement in these environments. This paper begins a process of mapping a set of priorities for amplifying engagement in interactive art installations. I first define the concept of ludic engagement and its usefulness as a lens for both design and evaluation in these settings. I then detail two fieldwork evaluations I conducted within two exhibitions of interactive artworks, and discuss their outcomes and the future directions of this research.

Relevance:

10.00%

Publisher:

Abstract:

Non-alcoholic fatty liver disease (NAFLD) is a condition that is frequently seen but seldom investigated. Until recently, NAFLD was considered benign, self-limiting and unworthy of further investigation, an opinion based on retrospective studies with relatively small numbers and scant follow-up of histology data (1). The prevalence among adults in the USA is 30%, and NAFLD is recognised as a common and increasing form of liver disease in the paediatric population (1). Australian data from New South Wales suggest that the prevalence of NAFLD in “healthy” 15-year-olds is 10% (2). Non-alcoholic fatty liver disease is a condition in which fat progressively invades the liver parenchyma. The degree of infiltration ranges from simple steatosis (fat only), to steatohepatitis (fat and inflammation), to steatohepatitis plus fibrosis (fat, inflammation and fibrosis), to cirrhosis (replacement of liver tissue by scarred, fibrotic and non-functioning tissue). Non-alcoholic fatty liver is diagnosed by exclusion rather than inclusion: none of the currently available diagnostic techniques (liver biopsy, liver function tests (LFTs), or imaging by ultrasound, computerised tomography (CT) or magnetic resonance imaging (MRI)) is specific for non-alcoholic fatty liver. An association exists between NAFLD, non-alcoholic steatohepatitis (NASH) and irreversible liver damage, cirrhosis and hepatoma. However, a more pervasive aspect of NAFLD is its association with the metabolic syndrome, which is characterised by increased insulin resistance (IR); NAFLD is thought to be its hepatic manifestation. Those with NAFLD have an increased risk of death (3), and NAFLD is an independent predictor of atherosclerosis and cardiovascular disease (1). Liver biopsy is considered the gold standard for the diagnosis, grading and staging of non-alcoholic fatty liver disease (4). Fatty liver is diagnosed when there is macrovesicular steatosis with displacement of the nucleus to the edge of the cell and at least 5% of hepatocytes are seen to contain fat (4). Steatosis represents fat accumulation in liver tissue without inflammation. However, the condition is called non-alcoholic fatty liver disease only when alcohol intake of more than 20-30 g per day has been excluded (5); non-alcoholic and alcoholic fatty liver are identical on histology (4). LFTs are indicative, not diagnostic: they indicate that a condition may be present but cannot identify which condition it is. A patient presenting with raised fasting blood glucose, low HDL (high-density lipoprotein) and elevated fasting triacylglycerols is likely to have NAFLD (6). Of the imaging techniques, MRI is the least variable and the most reproducible. With CT scanning, liver fat content can be semi-quantitatively estimated: with increasing hepatic steatosis, liver attenuation values decrease by 1.6 Hounsfield units for every milligram of triglyceride deposited per gram of liver tissue (7). Ultrasound permits early detection of fatty liver, often in the preclinical stages before symptoms are present and serum alterations occur. Earlier, accurate reporting of this condition will allow appropriate intervention, resulting in better patient health outcomes.

References
1. Chalasami N. Does fat alone cause significant liver disease: it remains unclear whether simple steatosis is truly benign. American Gastroenterological Association Perspectives, February/March 2008. www.gastro.org/wmspage.cfm?parm1=5097 Viewed 20 October 2008.
2. Booth M, George J, Denney-Wilson E. The population prevalence of adverse concentrations with adiposity of liver tests among Australian adolescents. Journal of Paediatrics and Child Health. 2008 November.
3. Catalano D, Trovato GM, Martines GF, Randazzo M, Tonzuso A. Bright liver, body composition and insulin resistance changes with nutritional intervention: a follow-up study. Liver Int. 2008 February;1280-9.
4. Choudhury J, Sanysl A. Clinical aspects of fatty liver disease. Semin Liver Dis. 2004;24(4):349-62.
5. Dionysus Study Group. Drinking factors as cofactors of risk for alcohol induced liver change. Gut. 1997;41:845-50.
6. Preiss D, Sattar N. Non-alcoholic fatty liver disease: an overview of prevalence, diagnosis, pathogenesis and treatment considerations. Clin Sci. 2008;115:141-50.
7. American Gastroenterological Association. Technical review on nonalcoholic fatty liver disease. Gastroenterology. 2002;123:1705-25.
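As a rough numerical illustration only of the CT attenuation relationship cited in reference 7 above (a decrease of about 1.6 HU per milligram of triglyceride per gram of liver tissue), the following sketch assumes a hypothetical fat-free baseline attenuation; it is not a clinical tool:

```python
def liver_triglyceride_mg_per_g(measured_hu, fat_free_hu=55.0, hu_per_mg_per_g=1.6):
    """Estimate liver triglyceride content (mg per g of tissue) from CT
    attenuation, assuming attenuation falls by about 1.6 HU for every mg
    of triglyceride per g of liver tissue (reference 7 above).
    fat_free_hu is an assumed baseline for a fat-free liver, illustrative only."""
    drop = fat_free_hu - measured_hu
    return max(drop, 0.0) / hu_per_mg_per_g

# Example: a liver measuring 31 HU against the assumed 55 HU fat-free baseline.
print(f"{liver_triglyceride_mg_per_g(31.0):.1f} mg triglyceride per g of liver tissue")
```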

Relevance:

10.00%

Publisher:

Abstract:

In Australia, advertising is a $13 billion industry which needs a supply of suitably skilled employees. Over the years, advertising education has developed from vocationally based courses to degree courses across the country. This paper uses diffusion theory, various secondary sources and interviews to observe the development of advertising education in Australia from its early past, to its current-day tertiary offerings, to the issues arising in the near future. Six critical issues are identified, along with observations about the challenges and opportunities within Australian advertising education. By looking back to the future, it is hoped that this historical review provides lessons for other countries with a similar educational structure or background, or even for other marketing communication disciplines on a similar evolutionary path.

Relevance:

10.00%

Publisher:

Abstract:

This dissertation by publication, which focuses on gender and the Australian federal parliament, has resulted in the submission of three refereed journal articles. Data for the study were obtained from 30 semi-structured interviews undertaken in 2006 with fifteen (15) male and fifteen (15) female members of the Australian parliament. The first of the articles is methodological and has been accepted for publication in the Australian Journal of Political Science. The paper argues that feminist political science is guided by five important principles: placing gender at the centre of the research, giving emphasis to women's voice, challenging the public/private divide, using research to transform society, and taking a reflexive approach to positionality. It is this last principle, the importance of taking a reflexive approach to research, which I explore in the paper. Through drawing on my own experiences as a member of the House of Representatives (Forde 1987-1996), I reflexively investigate the intersections between my background and my identity as a researcher. The second of the articles views the data through the lens of Acker's (1990) notion of the 'gendered organization', which posits that organizations are gendered along four dimensions: via the division of labour, through symbols, images and ideologies, by workplace interactions, and through the gendered components of individual identity. In this paper, which has been submitted to the British Journal of Political Science, each of Acker's (1990) dimensions is examined in terms of the data from interviews with male and female politicians. The central question investigated is thus: to what extent does the Australian parliament conform to Acker's (1990) concept of the 'gendered organization'? The third of the papers focuses specifically on data from interviews with the 15 male politicians and investigates how they view gender equality and the Australian parliament. The article, which has been submitted to the European Journal of Political Science, asks to what extent contemporary male politicians view the Australian parliament as gendered. Discourse analysis, that is, 'ways of viewing' (Bacchi, 1999, p. 40), is used as the approach to analyse the data. Three discursive frameworks by which male politicians view gender in the Australian parliament are identified: that the parliament is gendered as masculine but this is unavoidable; that the parliament is gendered as feminine and women are actually advantaged; and that the parliament is gender neutral and gender is irrelevant. It is argued that collectively these framing devices operate to mask the many constraints which exist to marginalise women from political participation and to undermine attempts to address women's political disadvantage as political participants. The article concludes by highlighting the significance of the paper beyond the Australian context and by calling for further research which names and critiques political men and their discourses on gender and parliamentary practices and processes.

Relevance:

10.00%

Publisher:

Abstract:

The work of Italian-based photo-artist Patrick Nicholas is analysed to show how his re-workings of classic ‘old-master’ paintings can be seen as the art of ‘redaction’, shedding new light on the relationship between originality and copying. I argue that redactional creativity is both highly productive of new meanings and a reinvention of the role of the medieval Golden Legend (Lives of the Saints).

Relevance:

10.00%

Publisher:

Abstract:

This is the lead article for an issue of M/C Journal on the theme ‘obsolete.’ It uses the history of the International Journal of Cultural Studies (of which the author has been editor since 1997) to investigate technological innovations and their scholarly implications in academic journal publishing; in particular the obsolescence of the print form. Print-based elements like cover-design, the running order of articles, special issues, refereeing and the reading experience are all rendered obsolete with the growth of online access to individual articles. The paper argues that individuation of reading choices may be accompanied by less welcome tendencies, such as a decline in collegiality, disciplinary innovation, and trust.

Relevance:

10.00%

Publisher:

Abstract:

Some Engineering Faculties are turning to the problem-based learning (PBL) paradigm to engender necessary skills and competence in their graduates. Since, at the same time, some Faculties are moving towards distance education, questions are being asked about the effectiveness of PBL for technical fields such as Engineering when delivered in virtual space. This paper outlines an investigation of how student attributes affect their learning experience in PBL courses offered in virtual space. A frequency distribution was superimposed on the outcome space of a phenomenographic study of a suitable PBL course to investigate the effect of different student attributes on the learning experience. It was discovered that the quality, quantity and style of facilitator interaction had the greatest impact on the student learning experience. This highlights the need to establish consistent student interaction plans and to set, and ensure compliance with, minimum standards with respect to facilitation and student interactions.

Relevance:

10.00%

Publisher:

Abstract:

Amphibian is a 10’00’’ musical work which explores new musical interfaces and approaches to hybridising performance practices from the popular music, electronic dance music and computer music traditions. The work is designed to be presented in a range of contexts associated with the electro-acoustic, popular and classical music traditions. The work is for two performers using two synchronised laptops, an electric guitar and a custom-designed gestural interface for vocal performers, the e-Mic (Extended Mic-stand Interface Controller). This interface was developed by one of the co-authors, Donna Hewitt. The e-Mic allows a vocal performer to manipulate the voice in real time through the capture of physical gestures via an array of sensors (pressure, distance, tilt) along with ribbon controllers and an X-Y joystick microphone mount. Performance data are then sent to a computer running audio-processing software, which is used to transform the audio signal from the microphone. In this work, data are also exchanged between performers via a local wireless network, allowing performers to work with shared data streams. The duo employs the gestural conventions of guitarist and singer (i.e. ‘a band’ in a popular music context), but transforms these sounds and gestures into new digital music. The gestural language of popular music is deliberately subverted and taken into a new context. The piece thus explores the nexus between the sonic and performative practices of electro-acoustic music and intelligent electronic dance music (‘idm’). This work was situated in the research fields of new musical interfacing, interaction design, experimental music composition and performance. The contexts in which the research was conducted were live musical performance and studio music production. The work investigated new methods for musical interfacing, performance data mapping, and hybrid performance and compositional practices in electronic music. The research methodology was practice-led. New insights were gained from the iterative experimental workshopping of gestural inputs, musical data mapping, inter-performer data exchange, software patch design, and data and audio processing chains. In respect of interfacing, there were innovations in the design and implementation of a novel sensor-based gestural interface for singers, the e-Mic, one of the only existing gestural controllers for singers. The work explored the compositional potential of sharing real-time performance data between performers and deployed novel methods for inter-performer data exchange and mapping. As regards stylistic and performance innovation, the work explored and demonstrated an approach to hybridising the gestural and sonic language of popular music with recent ‘post-digital’ approaches to laptop-based experimental music. The development of the work was supported by an Australia Council grant. Research findings have been disseminated via a range of international conference publications, recordings, radio interviews (ABC Classic FM), broadcasts, and performances at international events and festivals. The work was curated into the major Australian international festival Liquid Architecture, and was selected by an international music jury (through blind peer review) for presentation at the International Computer Music Conference in Belfast, Northern Ireland.
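The abstract does not name the software environment used; purely as an illustration of the kind of sensor-to-sound mapping and inter-performer data sharing described, the sketch below scales a pressure-sensor reading to a filter cutoff and forwards it as OSC messages both to a local audio-processing patch and to the other performer's laptop. The addresses, ports, OSC paths and scaling range are all assumptions, not details of the actual work.

```python
from pythonosc.udp_client import SimpleUDPClient

# Assumed endpoints: a local audio-processing patch and the second
# performer's laptop on the shared wireless network (placeholders).
local_audio = SimpleUDPClient("127.0.0.1", 9000)
other_performer = SimpleUDPClient("192.168.0.12", 9001)

def pressure_to_cutoff(pressure, low_hz=200.0, high_hz=8000.0):
    """Scale a normalised pressure-sensor reading (0.0-1.0) to a filter
    cutoff frequency in Hz; the mapping range is an assumption."""
    pressure = min(max(pressure, 0.0), 1.0)
    return low_hz + pressure * (high_hz - low_hz)

def on_sensor_reading(pressure):
    cutoff = pressure_to_cutoff(pressure)
    # Drive the local vocal-processing chain...
    local_audio.send_message("/voice/filter/cutoff", cutoff)
    # ...and share the raw gesture data with the other performer, who may
    # map the same stream to a different musical parameter.
    other_performer.send_message("/shared/pressure", pressure)

on_sensor_reading(0.42)  # a single illustrative reading from the sensor array
```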

Relevance:

10.00%

Publisher:

Abstract:

Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance that underlies such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007), collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008), issues related to the nature as well as the motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006), public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006), technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007), and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users. On this topic, articles show that a commercial activity based on open source software is possible, as they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activities: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications are different for each of these categories: the activities of providers of packaged solutions and IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as this issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. To date, the literature identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore in which contexts the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers which can be used in a different context.
To do so, this paper draws upon an exploratory case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager, and aims at depicting the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes further than the traditional concept of business model used by scholars in the open source literature. In this article, a business model is not considered only as a way of generating income (the ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenbloom, 2002; Teece, 2007). Consequently, this paper analyses business models from the point of view of these two components.