917 results for field of solenoid
Abstract:
There is ongoing and wide-ranging dispute over the proliferation of childhood behaviour disorders. In particular, the veracity of the category Attention Deficit Hyperactivity Disorder (ADHD) has been the subject of considerable scepticism. With no end to the debate in sight, it will be argued here that the problem might effectively be approached, not by addressing the specific features of ADHD itself, but rather by a philosophical analysis of one of the terms around which this entire problem revolves: that is, the notion of truth. If we state: “It is true that ADHD is a real disorder”, what exactly do we mean? Do we mean that it is an objective fact of nature? Do we mean that it fits seamlessly with other sets of ideas and explanations? Or do we simply mean that it works as an idea in a practical sense? This paper will examine the relationship between some of the dominant models of truth and the assertions made by those in the field of ADHD. Specifically, the paper will contrast the claim that ADHD is a real disorder with the claim that ADHD is a product of social governance. The intention is, first, to place some significant qualifications upon the validity of the truth-claims made by ADHD advocates, and second, to re-emphasise the potential and promise of philosophical investigation in providing productive new ways of thinking about some obstinate and seemingly intractable educational problems.
Abstract:
This review explores the question of whether chemometrics methods enhance the performance of electroanalytical methods. Electroanalysis has long benefited from well-established techniques such as potentiometric titrations, polarography and voltammetry, and from more novel ones such as electronic tongues and noses, which have enlarged the scope of applications. Electroanalytical methods have been improved with the application of chemometrics for simultaneous quantitative prediction of analytes or qualitative resolution of complex overlapping responses. Typical methods include partial least squares (PLS), artificial neural networks (ANNs), and multiple curve resolution methods (MCR-ALS, N-PLS and PARAFAC). This review aims to provide the practising analyst with a broad guide to electroanalytical applications supported by chemometrics. In this context, after a general consideration of the use of a number of electroanalytical techniques with the aid of chemometrics methods, several overviews follow, each focusing on an important field of application such as food, pharmaceuticals, pesticides and the environment. The growth of chemometrics in conjunction with electronic tongue and nose sensors is highlighted, followed by an overview of the use of chemometrics for the resolution of complicated profiles for qualitative identification of analytes, especially with the use of the MCR-ALS methodology. Finally, the performance of electroanalytical methods is compared with that of some spectrophotometric procedures on the basis of figures of merit. This comparison showed that electroanalytical methods can perform as well as spectrophotometric ones. PLS-1 appears to be the method of practical choice if a %relative prediction error of approximately ±10% is acceptable.
Abstract:
Mosston and Ashworth's Spectrum of Teaching Styles was first published in 1966 and is potentially the longest-surviving model of teaching within the field of physical education. Its longevity and influence are surely a testament to its value. Many tools have been developed through the years based on The Spectrum of Teaching Styles. In 2005, as part of a doctoral study, the author, Dr Edwards and Dr Ashworth developed a tool for researchers and teachers to identify which teaching styles from The Spectrum were being utilised when teaching physical education. It could also be utilised for self-assessment of teaching styles, or by those who work with Physical Education Teacher Education courses. The development of this tool took approximately four months, numerous emails and meetings. This presentation will outline this process, along with the reasons why such a tool was developed and the differences between it and others like it.
Abstract:
The trans-locative potential of the Internet has driven the design of many online applications. Online communities largely cluster around topics of interest, which take precedence over participants’ geographical locations. The site of production is often disregarded when creative content appears online. However, for some, a sense of place is a defining aspect of creativity. Yet environments that focus on the display and sharing of regionally situated content have, so far, been largely overlooked. Recent developments in geo-technologies have precipitated the emergence of a new field of interactive media. Entitled locative media, it emphasizes the geographical context of media. This paper argues that we might combine practices of locative media (experiential mapping and geo-spatial annotation) with aspects of online participatory culture (uploading, file-sharing and search categorization) to produce online applications that support geographically ‘located’ communities. It discusses the design considerations and possibilities of this convergence, making reference to an example, OurPlace 3G to 3D, which has to date been developed as a prototype. It goes on to discuss the benefits and potential uses of such convergent applications, including the co-production of spatio-temporal narratives of place.
Abstract:
The emergent field of practice-led research is a unique research paradigm that situates creative practice as both a driver and outcome of the research process. The exegesis that accompanies the creative practice in higher research degrees remains open to experimentation and discussion around what content should be included, how it should be structured, and its orientations. This paper contributes to this discussion by reporting on a content analysis of a large, local sample of exegeses. We have observed a broad pattern in contents and structure within this sample. Besides the introduction and conclusion, it has three main parts: situating concepts (conceptual definitions and theories), practical contexts (precedents in related practices), and new creations (the creative process, the artifacts produced and their value as research). This model appears to combine earlier approaches to the exegesis, which oscillated between academic objectivity in providing a context for the practice and personal reflection or commentary upon the creative practice. We argue that this hybrid or connective model assumes both orientations and so allows the researcher to effectively frame the practice as a research contribution to a wider field while doing justice to its invested poetics.
Abstract:
Introduction: The Google Online Marketing Challenge is a global competition in which student teams run advertising campaigns for small and medium-sized businesses (SMEs) using AdWords, Google’s text-based advertisements. In 2008, its inaugural year, over 8,000 students and 300 instructors from 47 countries representing over 200 schools participated. The Challenge ran in undergraduate and graduate classes in disciplines such as marketing, tourism, advertising, communication and information systems. Combining advertising and education, the Challenge gives students hands-on experience in the increasingly important field of online marketing, engages them with local businesses and motivates them through the thrill of a global competition. Student teams receive US$200 in credits for AdWords, Google’s premier advertising product that offers cost-per-click advertisements. The teams then recruit and work with a local business to devise an effective online marketing campaign. Students first outline a strategy, run a series of campaigns, and provide their business with recommendations to improve its online marketing. Teams submit two written reports for judging by 14 academics in eight countries. In addition, Google AdWords experts judge teams on campaign statistics such as success metrics and account management. Rather than a marketing simulation against a computer or hypothetical marketing plans for hypothetical businesses, the Challenge has student teams develop and manage real online advertising campaigns for their clients and compete against peers globally.
Abstract:
We examined differences in response latencies obtained during a validated video-based hazard perception driving test between three healthy, community-dwelling groups: 22 mid-aged (35-55 years), 34 young-old (65-74 years), and 23 old-old (75-84 years) current drivers, matched for gender, education level, and vocabulary. We found no significant difference in performance between mid-aged and young-old groups, but the old-old group was significantly slower than the other two groups. The differences between the old-old group and the other groups combined were independently mediated by useful field of view (UFOV), contrast sensitivity, and simple reaction time measures. Given that hazard perception latency has been linked with increased crash risk, these results are consistent with the idea that increased crash risk in older adults could be a function of poorer hazard perception, though this decline does not appear to manifest until age 75+ in healthy drivers.
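The study's mediation claim — that the old-old group's slower hazard-perception latencies operate through measures such as UFOV — can be illustrated with a minimal regression-based (Baron and Kenny style) sketch. The data below are fabricated, and the group labels, mediator scores and effect sizes are hypothetical, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical data: group (1 = old-old, 0 = younger), a mediator
# (UFOV score, worse in the old-old group), and hazard-perception latency.
group = rng.integers(0, 2, n)
ufov = 100 + 40 * group + rng.normal(0, 10, n)       # group degrades UFOV
latency = 2.0 + 0.02 * ufov + rng.normal(0, 0.2, n)  # UFOV drives latency

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

total = ols(group, latency)[1]                            # c: total group effect
direct = ols(np.column_stack([group, ufov]), latency)[1]  # c': controlling UFOV

print(f"total effect  c  = {total:.3f}")
print(f"direct effect c' = {direct:.3f}")  # shrinks toward 0 if UFOV mediates
```

In this toy setup latency depends on group only through UFOV, so the direct effect c' collapses toward zero once the mediator is controlled; a real analysis would also bootstrap the indirect effect rather than rely on the coefficient comparison alone.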
Abstract:
This thesis argues that the end of Soviet Marxism and a bipolar global political imaginary at the dissolution of the short Twentieth Century poses an obstacle for anti-systemic political action. Such a blockage of alternate political imaginaries can be discerned by reading the work of Francis Fukuyama and "Endism" as performative invocations of the closure of political alternatives, and thus as an ideological proclamation which enables and constrains forms of social action. It is contended that the search through dialectical thought for a competing universal to posit against "liberal democracy" is a fruitless one, because it reinscribes the terms of teleological theories of history which work to effect closure. Rather, constructing a phenomenological analytic of the political conjuncture, the thesis suggests that the figure of messianism without a Messiah is central to a deconstructive reframing of the possibilities of political action - a reframing attentive to the rhetorical tone of texts. The project of recovering the political is viewed through a phenomenological lens. An agonistic political distinction must be made so as to memorialise the remainders and ghosts of progress, and thus to gesture towards an indeconstructible justice which would serve as a horizon for the articulation of an empty universal. This project is furthered by a return to a certain phenomenology inspired by Cornelius Castoriadis, Claude Lefort, Maurice Merleau-Ponty and Ernesto Laclau. The thesis provides a reading of Jacques Derrida and Walter Benjamin as thinkers of a minor universalism, a non-prescriptive utopia, and places their work in the context of new understandings of religion and the political as quasi-transcendentals which can be utilised to think through the aporias of political time in order to grasp shards of meaning. 
Derrida and Chantal Mouffe's deconstructive critique and supplement to Carl Schmitt's concept of the political is read as suggestive of a reframing of political thought which would leave the political question open and thus enable the articulation of social imaginary significations able to inscribe meaning in the field of political action. Thus, the thesis gestures towards a form of thought which enables rather than constrains action under the sign of justice.
Abstract:
This article explores the opportunities and challenges surrounding a viable cross-disciplinary engagement between the Humanities disciplines and the Creative Practice disciplines within the innovative context of the Creative Industries Faculty at the Queensland University of Technology. This will involve a charting of the intersection of the emerging field of creative practice-led research with various disciplines in the Humanities such as cultural studies. The potential for a reciprocal, transformative process in these creative fields will be addressed. Several examples of postgraduate students’ research will be foregrounded as case studies of the issues involved in fostering a genuine cultural critique both within and through creative practice. Some observers may argue that the research higher degree creative practitioner in effect looks outward from the symbolic material forms being created, in search of an interpretative paradigm, thereby trawling the Humanities for a theory. Several current debates within the postgraduate research arena regarding the balance between the theoretical exegesis and the creative work (e.g. performance, drama, dance, visual art, creative writing, film and screen production, music, interactive media etc) will also be critically examined.
Abstract:
Malcolm Shepherd Knowles was a key writer and theorist in the field of adult education in the United States. He died in 1997 and left a large legacy of books and journal articles. This thesis traced the development of his thinking over the 46-year period from 1950 to 1995. It examined the 25 works authored, co-authored, edited, reissued and revised by him during that period. The writings were scrutinised using a literature research methodology to expose the theoretical content, and a history of thought lens to identify and account for the development of major ideas. The methodology enabled a gradual unfolding of the history. A broadly-consistent and sequential pattern of thought focusing on the notion of andragogy emerged. The study revealed that after the initial phases of exploratory thinking, Knowles developed a practical-theoretical framework he believed could function as a comprehensive theory of adult learning. As his thinking progressed, his theory developed into a unified framework for human resource development and, later, into a model for the development of self-directed lifelong learners. The study traced the development of Knowles’ thinking through the phases of thought, identified the writings that belonged within each phase and produced a series of diagrammatic representations showing the evolution of his conceptual framework. The production of a history of the development of Knowles’ thought is the major outcome of the study. In addition to plotting the narrative sequence of thought-events, the history helps to explicate the factors and conditions that influenced Knowles’ thinking and to show the interrelationships between ideas. The study should help practitioners in their use and appreciation of Knowles’ works.
Abstract:
The international focus on embracing daylighting for energy-efficient lighting purposes, and the corporate sector’s indulgence in the perception of workplace and work-practice “transparency”, has spurred an increase in highly glazed commercial buildings. This in turn has renewed issues of visual comfort and daylight-derived glare for occupants. In order to ascertain evidence, or predict risk, of these events, appraisals of these complex visual environments require detailed information on the luminances present in an occupant’s field of view. Conventional luminance meters are an expensive and time-consuming method of achieving these results. To create a luminance map of an occupant’s visual field using such a meter requires too many individual measurements to be a practical measurement technique. The application of digital cameras as luminance measurement devices has solved this problem. With high dynamic range imaging, a single digital image can be created to provide luminances on a pixel-by-pixel level within the broad field of view afforded by a fish-eye lens: virtually replicating an occupant’s visual field and providing rapid yet detailed luminance information for the entire scene. With proper calibration, relatively inexpensive digital cameras can be successfully applied to the task of luminance measurement, placing them in the realm of tools that any lighting professional should own. This paper discusses how a digital camera can become a luminance measurement device and then presents an analysis of results obtained from post-occupancy measurements from building assessments conducted by the Mobile Architecture Built Environment Laboratory (MABEL) project. This discussion leads to the important realisation that the placement of such tools in the hands of lighting professionals internationally will provide new opportunities for the lighting community in terms of research on critical issues in lighting such as daylight glare and visual quality and comfort.
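The camera-as-luminance-meter idea can be sketched as follows, assuming a fused HDR image already in linear RGB. The random array is a stand-in for a real exposure-fused photograph, and the calibration factor K is a hypothetical value (Radiance's 179 lm/W white efficacy is used here as a plausible placeholder); in practice K is derived by comparing a pixel region against a spot luminance meter reading of the same surface:

```python
import numpy as np

# Hypothetical HDR image: per-pixel linear RGB values, e.g. fused from a
# bracketed exposure sequence (a random array stands in for the photograph).
rng = np.random.default_rng(2)
hdr_rgb = rng.uniform(0.0, 1.0, size=(480, 640, 3))

# Rec. 709 weights convert linear RGB to relative luminance.
WEIGHTS = np.array([0.2126, 0.7152, 0.0722])

# Calibration factor (cd/m^2 per unit pixel value) from comparison against a
# spot luminance meter; 179.0 is a hypothetical placeholder value.
K = 179.0

# Pixel-by-pixel luminance map of the scene, in cd/m^2.
luminance_map = K * (hdr_rgb @ WEIGHTS)

print(luminance_map.shape)  # (480, 640): one luminance value per pixel
```

With a fish-eye projection, each pixel additionally maps to a direction in the occupant's visual field, which is what allows glare indices to be computed from a single capture.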
Abstract:
The ways in which the "traditional" tension between words and artwork can be perceived has huge implications for understanding the relationship between critical or theoretical interpretation, art and practice, and research. Within the practice-led PhD this can generate a strange sense of disjuncture for the artist-researcher, particularly when engaged in writing the exegesis. This paper aims to explore this tension through an introductory investigation of the work of the philosopher Andrew Benjamin. For Benjamin, criticism completes the work of art. Criticism is, with the artwork, at the centre of our experience and theoretical understanding of art – in this way the work of art and criticism are co-productive. The reality of this co-productivity can be seen in three related articles on the work of American painter Marcia Hafif. In each of these articles there are critical negotiations of just how the work of art operates as art and theoretically, within the field of art. This focus has important ramifications for the writing and reading of the exegesis within the practice-led research higher degree. By including art as a significant part of the research reporting process, the artist-researcher is also staking a claim as to the critical value of their work. Rather than resisting the tension between word and artwork, the practice-led artist-researcher needs to embrace the co-productive nature of critical word and creative work to more completely articulate their practice and its significance as research. The ideal venue and opportunity for this is the exegesis.
What are students' understandings of how digital tools contribute to learning in design disciplines?
Abstract:
Building Information Modelling (BIM) is evolving in the construction industry as a successor to CAD. CAD is mostly a technical tool that conforms to existing industry practices; however, BIM has the capacity to revolutionise industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team, facilitating collaboration and allowing experimentation in design. Exposing design students to this technology through their formal studies allows them to engage with cutting-edge industry practices and to help shape the industry upon their graduation. Since this technology is relatively new to the construction industry, there are no accepted models for how to “teach” BIM effectively at university level. Developing learning models to enable students to make the most of their learning with BIM presents significant challenges to those teaching in the field of design. To date there are also no studies of students’ experiences of using this technology. This research reports on the introduction of BIM software into a second-year Bachelor of Design course. This software has the potential to change industry standards through its ability to revolutionise the work practices of those involved in large-scale design projects. Students’ understandings and experiences of using the software in order to complete design projects as part of their assessment are reported here. In-depth semi-structured interviews with six students revealed that students held views of the software that ranged from novice to sophisticated. They had variations in understanding of how the software could be used to complete course requirements, to assist with the design process and in the workplace. They had engaged in limited exploration of the collaborative potential of the software as a design tool.
Their understanding of the significance of BIM for the workplace was also variable. The results indicate that students are beginning to develop an appreciation for how BIM could aid or constrain the work of designers, but that this appreciation is highly varied and likely to be dependent on the students’ previous experiences of working in a design studio environment. Their range of understandings of the significance of the technology is a reflection of their level of development as designers (they are “novice” designers). The results also indicate that there is a need for subjects in later years of the course that allow students to specialise in the area of digital design and to develop more sophisticated views of the role of technology in the design process. There is also a need to capitalise on the collaborative potential inherent in the software in order to realise its capability to streamline some aspects of the design process. As students become more sophisticated designers we should explore their understanding of the role of technology as a design tool in more depth in order to make recommendations for improvements to teaching and learning practice related to BIM and other digital design tools.
Abstract:
There is not a single, coherent jurisprudence for civil society organisations. Pressure for a clearly enunciated body of law applying to the whole of this sector of society continues to increase. The rise of third sector scholarship, the retreat of the welfare state, the rediscovery of the concept of civil society and pressures to strengthen social capital have all contributed to an ongoing stream of inquiry into the laws that regulate and favour civil society organisations. There have been almost thirty inquiries over the last sixty years into the doctrine of charitable purpose in common law countries. Those inquiries have established that problems with the law applying to civil society organisations are rooted in the common law adopting a ‘technical’ definition of charitable purpose and the failure of this body of law to develop in response to societal changes. Even though it is now well recognised that problems with law reform stem from problems inherent in the doctrine of charitable purpose, statutory reforms have merely ‘bolted on’ additions to the flawed ‘technical’ definition. In this way the scope of operation of the law has been incrementally expanded to include a larger number of civil society organisations. This piecemeal approach continues the exclusion of most civil society organisations from the law of charities discourse, and fails to address the underlying jurisprudential problems. Comprehensive reform requires revisiting the foundational problems embedded in the doctrine of charitable purpose, being informed by recent scholarship, and a paradigm shift that extends the doctrine to include all civil society organisations. Scholarly inquiry into civil society organisations, particularly from within the discipline of neoclassical economics, has elucidated insights that can inform legal theory development.
This theory development requires decoupling the two distinct functions performed by the doctrine of charitable purpose, which are: setting the scope of regulation, and determining entitlement to favours, such as tax exemption. If the two different functions of the doctrine are considered separately in the light of theoretical insights from other disciplines, the architecture for a jurisprudence emerges that facilitates regulation, but does not necessarily favour all civil society organisations. Informed by that broader discourse, it is argued that when determining the scope of regulation, civil society organisations are identified by reference to charitable purposes that are not technically defined. These charitable purposes are in essence purposes which are: Altruistic, for public Benefit, pursued without Coercion. These charitable purposes differentiate civil society organisations from organisations in the three other sectors, namely: Business, which is manifest in lack of altruism; Government, which is characterised by coercion; and Family, which is characterised by benefits being private, not public. When determining entitlement to favour, it is theorised that it is the extent or nature of the public benefit evident in the pursuit of a charitable purpose that justifies entitlement to favour. Entitlement to favour based on the extent of public benefit is theoretically simpler – the greater the public benefit, the greater the justification for favour. To be entitled to favour based on the nature of a purpose being charitable, the purpose must fall within one of three categories developed from the first three heads of Pemsel’s case (the landmark categorisation case on taxation favour). The three categories proposed are: Dealing with Disadvantage, Encouraging Edification, and Facilitating Freedom.
In this alternative paradigm a recast doctrine of charitable purpose underpins a jurisprudence for civil society in a way similar to the way contract underpins the jurisprudence for the business sector, the way that freedom from arbitrary coercion underpins the jurisprudence of the government sector and the way that equity within families underpins succession and family law jurisprudence for the family sector. This alternative architecture for the common law, developed from the doctrine of charitable purpose but inclusive of all civil society purposes, is argued to cover the field of the law applying to civil society organisations and warrants its own third space as a body of law between public law and private law in jurisprudence.
Abstract:
Cardiovascular diseases refer to the class of diseases that involve the heart or blood vessels (arteries and veins). Examples of medical devices for treating cardiovascular diseases include ventricular assist devices (VADs), artificial heart valves and stents. Metallic biomaterials such as titanium and its alloys are commonly used for ventricular assist devices. However, titanium and its alloys show unacceptable thrombosis, which represents a major obstacle to be overcome. Polyurethane (PU) polymer has better blood compatibility and has been used widely in cardiovascular devices. Thus one aim of the project was to coat a PU polymer onto a titanium substrate by increasing the surface roughness and surface functionality. Since the endothelium of a blood vessel has the most ideal non-thrombogenic properties, it was the target of this research project to grow an endothelial cell layer as a biological coating based on the tissue engineering strategy. However, seeding endothelial cells on smooth PU coating surfaces is problematic due to the quick loss of seeded cells, which do not adhere to the PU surface. Thus it was another aim of the project to create a porous PU top layer on the dense PU pre-layer-coated titanium substrate. The method of preparing the porous PU layer was based on solvent casting/particulate leaching (SCPL) modified with centrifugation. Without the step of centrifugation, the distribution of the salt particles was not uniform within the polymer solution, and the degree of interconnection between the salt particles was not well controlled. Using the centrifugal treatment, the pore distribution became uniform and the pore interconnectivity was improved even at a high polymer solution concentration (20%) when the maximal salt weight was added to the polymer solution. The titanium surfaces were modified by alkali and heat treatment, followed by functionalisation using hydrogen peroxide.
A silane coupling agent was coated before the application of the dense PU pre-layer and the porous PU top layer. The ability of the porous top layer to grow and retain the endothelial cells was also assessed through cell culture techniques. The bonding strengths of the PU coatings to the modified titanium substrates were measured and related to the surface morphologies. The outcome of the project is that it has laid a foundation to achieve the strategy of endothelialisation for the blood compatibility of medical devices. This thesis is divided into seven chapters. Chapter 2 describes the current state of the art in the field of surface modification in cardiovascular devices such as ventricular assist devices (VADs). It also analyses the pros and cons of the existing coatings, particularly in the context of this research. The surface coatings for VADs have evolved from early organic/ inorganic (passive) coatings, to bioactive coatings (e.g. biomolecules), and to cell-based coatings. Based on the commercial applications and the potential of the coatings, the relevant review is focused on the following six types of coatings: (1) titanium nitride (TiN) coatings, (2) diamond-like carbon (DLC) coatings, (3) 2-methacryloyloxyethyl phosphorylcholine (MPC) polymer coatings, (4) heparin coatings, (5) textured surfaces, and (6) endothelial cell lining. Chapter 3 reviews the polymer scaffolds and one relevant fabrication method. In tissue engineering, the function of a polymeric material is to provide a 3-dimensional architecture (scaffold) which is typically used to accommodate transplanted cells and to guide their growth and the regeneration of tissue. The success of these systems is dependent on the design of the tissue engineering scaffolds. Chapter 4 describes chemical surface treatments for titanium and titanium alloys to increase the bond strength to polymer by altering the substrate surface, for example, by increasing surface roughness or changing surface chemistry. 
The nature of the surface treatment prior to bonding is found to be a major factor controlling the bonding strength. By increasing surface roughness, an increase in surface area occurs, which allows the adhesive to flow in and around the irregularities on the surface to form a mechanical bond. Changing the surface chemistry also results in the formation of a chemical bond. Chapter 5 shows that bond strengths between titanium and polyurethane could be significantly improved by surface treating the titanium prior to bonding. Alkaline heat treatment and H2O2 treatment were applied to change the surface roughness and the surface chemistry of titanium. Surface treatment increases the bond strength by altering the substrate surface in a number of ways, including increasing the surface roughness and changing the surface chemistry. Chapter 6 deals with the characterisation of the polyurethane scaffolds, which were fabricated using an enhanced solvent casting/particulate (salt) leaching (SCPL) method developed for preparing three-dimensional porous scaffolds for cardiac tissue engineering. The enhanced method combines a conventional SCPL method with a step of centrifugation, the centrifugation being employed to improve the pore uniformity and interconnectivity of the scaffolds. It is shown that the enhanced SCPL method and a collagen coating resulted in a spatially uniform distribution of cells throughout the collagen-coated PU scaffolds. In Chapter 7, the enhanced SCPL method is used to form porous features on the polyurethane-coated titanium substrate. The cavities anchored the endothelial cells, allowing them to remain on the blood-contacting surfaces. It is shown that the surface porosities created by the enhanced SCPL may be useful in forming a stable endothelial layer upon the blood-contacting surface.
Chapter 8 finally summarises the entire work performed on the fabrication and analysis of the polymer-Ti bonding, the enhanced SCPL method and the PU microporous surface on the metallic substrate. It then outlines the possibilities for future work and research in this area.