829 results for haptic essence
Abstract:
This study examines the context of coordinated responses, the triggers for coordinated responses, and the preference for or choice of coordinating strategies in road traffic injury prevention at the local level in selected OECD countries. This aim is achieved through a mixed-methods approach: 22 semi-structured interviews were conducted with road traffic injury prevention experts from five OECD countries, and 31 professional road traffic injury prevention stakeholders from seven OECD nations completed a self-administered online survey. The study found resource limitations and interdependence across actors within the context of road traffic injury prevention at the local level. Furthermore, it revealed that the realization of resource dependency acts as a trigger for coordinated responses at the local level. The study also identified two coordinating strategies favored by road traffic injury prevention experts: self-organizing community groups, which are seen as providing a platform for delivering programs within communities, and the funding of community groups to forge partnerships. However, the study did not appear to endorse other strategies such as the formalization of coordinated responses or a legal mandate to coordinate responses. In essence, the findings suggest a need to manage coordinated responses from an adaptive perspective, with interactions across road traffic injury prevention programs forged on a mutual understanding of the interdependency arising from resource scarcity. The role of legislation and top-down national models in the local-level management of coordinated responses is therefore likely to be one of identifying opportunities to interact with self-organized community groups and to fund partnership-based road traffic injury prevention events.
Abstract:
Introduction: Different types of hallucinations are symptomatic of different conditions. Schizotypal hallucinations are unique in that they follow existing delusional narrative patterns: they are often bizarre, they are generally multimodal, and they are particularly vivid (the experience of a newsreader abusing you personally over the TV is both visual and aural; patients who feel and hear silicone chips under their skin suffer from haptic as well as aural hallucinations, and so on). Although there are a number of hypotheses for hallucinations, few cogently grapple with the sheer bizarreness of those experienced in schizotypal psychosis. Methods: A review-based hypothesis, traversing theory from the molecular level to phenomenological expression as a distinct and recognizable symptomatology. Conclusion: Hallucinations appear to be caused by a two-fold dysfunction in the mesofrontal dopamine pathway, which is considered here to mediate attention of different types: in the anterior medial frontal lobe, the receptors (largely D1 type) mediate declarative awareness, whereas the receptors in the striatum (largely D2 type) mediate latent awareness of known schemata. In healthy perception, most of the perceptual load is carried by the latter: by the top-down predictive and mimetic engine, with the bottom-up mechanism serving as a secondary tool that brings conscious deliberation to stimuli that fail to match expectations. In schizophrenia, the predictive mode is over-stimulated, while the bottom-up feedback mechanism atrophies. This dysfunctional distribution pattern effectively confines dopamine activity to the striatum, thereby stimulating the structural components of thought and behaviour: well-learned routines, narrative structures, lexica, grammar, schemata, archetypes, and other procedural resources. Meanwhile, the loss of activity in the frontal complex reduces the capacity for declarative awareness and for processing anything that fails to meet expectations.
Abstract:
The importance of a thorough and systematic literature review has long been recognised across academic domains as critical to the foundation of new knowledge and theory evolution. Driven by an exponentially growing body of knowledge in the IS discipline, there has been a recent influx of guidance on how to conduct a literature review. As literature reviews emerge as a standalone research method in their own right, these method-focused guidelines are increasingly of interest and are gaining acceptance at top-tier IS publication outlets. Nevertheless, the finer details which offer justification for the selected content, and the effective presentation of supporting data, have not been widely discussed in these method papers to date. This paper addresses that gap by exploring the concept of ‘literature profiling’ and arguing that it is a key aspect of a comprehensive literature review. The study establishes the importance of profiling for managing aspects such as quality assurance, transparency and the mitigation of selection bias, and then discusses how profiling can provide a valid basis for data analysis based on the attributes of the selected literature. In essence, this study has conducted an archival analysis of literature (predominantly from the IS domain) to present its main argument, the value of literature profiling, with supporting exemplary illustrations.
Abstract:
In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, biological future-predictor Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences. 
Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.
Abstract:
Background: Nicotiana benthamiana is an allo-tetraploid plant, which can be challenging for de novo transcriptome assembly due to homeologous and duplicated gene copies. Transcripts generated from such genes can be distinct yet highly similar in sequence, with markedly differing expression levels. This can lead to unassembled, partially assembled or mis-assembled contigs. Because de novo assemblers differ in their properties, no single assembler with any one parameter setting can reassemble all possible transcripts from a transcriptome. Results: In an effort to maximise the diversity and completeness of de novo assembled transcripts, we utilised four de novo transcriptome assemblers, TransAbyss, Trinity, SOAPdenovo-Trans, and Oases, using a range of k-mer sizes and different input RNA-seq read counts. We complemented the parameter space biologically by using RNA from 10 plant tissues. We then combined the output of all assemblies into a large super-set of sequences. Using a method from the EvidentialGene pipeline, the combined assembly was reduced from 9.9 million de novo assembled transcripts to about 235,000, of which about 50,000 were classified as primary. Metrics such as average bit-scores, feature response curves and the ability to distinguish paralogous or homeologous transcripts indicated that the EvidentialGene-processed assembly was of high quality. Of 35 RNA silencing gene transcripts, 34 were identified as assembled to full length, whereas in a previous assembly using only one assembler, 9 of these were partially assembled. Conclusions: To achieve a high-quality transcriptome, it is advantageous to implement and combine the output from as many different de novo assemblers as possible. We have in essence taken the ‘best’ output from each assembler while minimising sequence redundancy. We have also shown that simultaneous assessment of a variety of metrics, not just contig length, is necessary to gauge the quality of assemblies.
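The combine-then-reduce strategy described above can be sketched in a few lines of Python. This is a minimal illustration only: the file names are hypothetical placeholders for per-assembler outputs, and the exact-duplicate collapse is a deliberately simplified stand-in for the EvidentialGene classification step (tr2aacds), which clusters transcripts by coding potential rather than simple sequence identity.

```python
# Minimal sketch: merge the output of several de novo assemblers and
# collapse exact duplicate sequences. File names are hypothetical, and the
# dedup step is a simplified stand-in for EvidentialGene's tr2aacds.
from pathlib import Path

ASSEMBLY_FILES = {                     # hypothetical per-assembler outputs
    "trinity": "trinity_k25.fasta",
    "oases": "oases_k31.fasta",
    "soap": "soapdenovo_trans_k41.fasta",
    "transabyss": "transabyss_k55.fasta",
}

def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, seq = None, []
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line[1:], []
            elif line:
                seq.append(line)
        if header is not None:
            yield header, "".join(seq)

def combine_and_dedupe(files, out_path="combined_nonredundant.fasta"):
    """Merge all assemblies, tag each contig with its source assembler,
    and keep one representative contig per identical sequence."""
    seen = {}
    total = 0
    for source, path in files.items():
        if not Path(path).exists():
            continue  # skip assemblers that were not run
        for header, seq in read_fasta(path):
            total += 1
            # keep the first contig observed for each exact sequence
            seen.setdefault(seq.upper(), f"{source}|{header}")
    with open(out_path, "w") as out:
        for seq, name in seen.items():
            out.write(f">{name}\n{seq}\n")
    print(f"{total} input contigs reduced to {len(seen)} non-redundant contigs")

if __name__ == "__main__":
    combine_and_dedupe(ASSEMBLY_FILES)
```

In practice, the merged, non-redundant super-set produced this way would then be passed to the EvidentialGene pipeline, which classifies transcripts into primary and alternate sets as described in the abstract.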
Abstract:
Findings from numerous quantitative studies suggest that spouses of patients undergoing Coronary Artery Bypass (CAB) surgery experience both physical and emotional stress before and after their partner's surgery. Such studies have contributed to our understanding of the spouses' experiences; however, they have largely failed to capture the qualitative experience of what it is like to be the spouse of a partner who has undergone CAB surgery. The objective of this study was to describe the experience of spouses of patients who had recently undergone CAB surgery. The study was guided by Husserl's phenomenological approach to qualitative research. In accordance with the nature of phenomenological research, the number of participants necessarily needs to be small, because phenomenology values the unique experience of individuals. The study therefore gathered data from four participants using open-ended, in-depth interviews. The method of analysis was adapted from Amedeo Giorgi's five-step empirical phenomenological process, which brackets preconceived notions and reduces participants' accounts to their essence or essential meanings. Numerous themes common to the spouses emerged, including: seeking information; the necessity for rapid decision making; playing guardian; a desire to debrief with their partner; and, lastly, uncertainty about their future role. This study has attempted to understand the phenomenon of the spouse's experience, and in doing so we believe we now have a better understanding of, and insight into, the needs of spouses of CAB surgery patients. This adds another dimension to our existing body of knowledge and further facilitates holistic patient care.
Abstract:
My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925–1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition which Deleuze names representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind apparent reality and by explaining the manifest diversity of the given through such notions as essence, idea, God, or the totality of the world. In contrast, Deleuze states that abstractions such as these do not explain anything; rather, they themselves need to be explained. Yet Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze calls his philosophy transcendental empiricism, and he seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter concerns, first of all, the question of subjectivity and results in a conception of identity as a non-essential process. A pre-given concept of identity does not explain the nature of things; the concept itself must be explained. From this point of view, the process of individualisation must become the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis develops a morphogenetic theory of individuation-as-process. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality. As the virtual concerns becoming, or the continuous process of actualisation, time rather than space will be the privileged field of consideration. Chapter three is devoted to the discussion of the temporal aspect of the virtual and of difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Art and aesthetic experience are therefore valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic, the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. Developed on the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost artistic presentation, I look into the identity of a work of art as assemblage. To take an example of a concrete artistic practice, and to remain within the problematic of the simulacrum, I finally address the question of reproduction, particularly in the case of recorded music and its identity with regard to the work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.
Abstract:
This study analyses British military planning and actions during the Suez Crisis of 1956. It seeks to identify the military reasons for the changes of concept during the planning and compares these reasons with the tactical doctrines of the time. The thesis draws extensively on military documents preserved in the National Archives, London. In order to expand the understanding of the exchange of views during the planning process, the private papers of high-ranking military officials have also been consulted. French military documents preserved in the Service Historique de la Défense, Paris, have provided an important point of comparison. The Suez Crisis caught the British armed forces in the middle of a transition phase. The main objective of the armed forces was to establish a credible deterrent against the Soviet Union. However, due to overseas commitments, with the Middle East playing a paramount role because of its economic importance, the armed forces were compelled to also prepare for Limited War and the Cold War. The armed forces were not fully prepared to meet this demand. The Middle Eastern garrison was being re-organised after the withdrawal from the Canal Base, and the concept of a strategic reserve remained unimplemented. The tactical doctrines of the time were based on experiences from the Second World War. As a result, the British view of amphibious operations and the subsequent campaigns emphasised careful planning, mastery of the sea and the air, sufficient superiority in numbers and firepower, centralised command and extensive administrative preparations. The British military had realised that Nasser could nationalise the Suez Canal and had prepared an outline plan to meet this contingency. Although the plan was nothing more than a concept, it was accepted as a basis for further planning when the Canal was nationalised at the end of July. This plan was short-lived. The nominated Task Force Commanders shifted the landing site from Port Said to Alexandria because it enabled faster expansion of the bridgehead. In addition, further operations towards Cairo, the hub of Nasser's power, would be easier to conduct. The operational concept can be described as traditional and was in accordance with amphibious warfare doctrine. This plan was completely changed at the beginning of September. Apparently, General Charles Keightley, the Commander-in-Chief, and the Chairman of the Chiefs of Staff Committee developed the idea of prolonged aerial operations. The essence of the concept was to break the Egyptian will to resist by attacking the oil facilities, the transportation system and the armed forces. This 'victory through air' concept would be supported by carefully planned psychological operations. The concept was in accordance with Royal Air Force doctrine, which promoted a bomber offensive against selected target categories. General Keightley's plan was accepted despite suspicions at every planning level. The Joint Planning Staff and the Task Force Commanders opposed the concept from beginning to end because of its unpredictability. There was no information to suggest that the bombing would persuade the Egyptians to submit. This problem was worsened by the fact that British intelligence was unable to provide reliable strategic information. The Task Force Commanders, who were responsible for the tactical plans, were not able to change Keightley's mind, but owing to their resistance the concept was expanded to include a traditional amphibious assault on Port Said.
The bombing campaign was never tested, as the Royal Air Force was denied authorisation to destroy the transportation and oil targets. The Chiefs of Staff and General Keightley were too slow to realise that the execution of the plan depended on the determination of the Prime Minister. However, poor health, a lack of American and domestic support and the indecisiveness of the military had ruined Eden's resolve. In the end, a very traditional amphibious assault, which was bound to succeed at the tactical level but fail at the strategic level, was launched against Port Said.
Abstract:
Tuure Junnila, PhD (1910-1999), was one of Finland's most renowned conservative politicians of the post-war period. Junnila is remembered primarily as a persistent opponent of Urho Kekkonen, a long-term Member of Parliament, a conspicuous member of the opposition and a prolific political writer. Junnila's ideologies and political views were conservative, and he is one of the most outstanding figures in the history of the National Coalition Party. Junnila also made an extensive career outside politics, first as an economist and then as an executive of Finland's leading commercial bank, Kansallis-Osake-Pankki. The Young Conservative is a partial biography, written using traditional historical research methods, which examines Junnila's personal history and his activity in public life up to 1956. The study begins by investigating Junnila's background through his childhood, school years, university studies and early professional career. It also looks at Junnila's work as an economist and practical banker. Particular attention is paid to Junnila's political work, focusing consistently on the following five often overlapping areas: (1) economic policy, (2) domestic policy, (3) foreign and security policy, (4) Junnila and Urho Kekkonen, and (5) Junnila, the Coalition Party and Finnish conservatism. In his economic policy, Junnila emphasised the importance of economic stability, opposed socialisation and the growth of public expenditure, defended the free market system and private entrepreneurship, and demanded tax cuts. This policy was very popular within the Coalition Party during the early 1950s, making Junnila the leading conservative economic politician of the time. In terms of domestic policy, Junnila demanded as early as the 1940s that a "third force" be established in Finland to counterbalance the agrarian and labour parties by uniting conservative and liberal ideologies under the same roof. Foreign and security policy is the area of Junnila's political activity most clearly situated after the mid-1950s. However, Junnila's early speeches and writings already show a striving towards the unconditional neutrality modelled by Switzerland and Sweden and a strong emphasis on Finland's right to internal self-determination. Junnila, like the Coalition Party as a whole, adopted a consistently critical approach towards Urho Kekkonen between 1951 and 1956, but this attitude was not as bluntly negative and all-round antagonistic as many previous studies have implied. Junnila was one of the leading Finnish conservatives of the early 1950s and, in essence, his views were analogous to the general alignment of the Coalition Party at the time: conservative in ideology and general policy, and liberal in economic policy.
Abstract:
The aim of this study is to define and analyse the symbolism hidden in the gamelan music of Central Java, especially in the Yogyakartanese wayang kulit shadow theatre. The dissertation is divided into two parts. The first part deals with the theory, history and practice of Central Javanese shadow theatre. It also presents the tone symbol theory of B. Y. H. Sastrapustaka, a court servant and musician of the sultan's palace of Yogyakarta, on which this study is based. For historical comparison, other theories and phenomena that seem to have some connection with this tone symbol theory are presented, as well as the equipment of the shadow theatre, its music and musical instruments, and the shadow theatre in general as treated in the literature. The theoretical and methodological basis of the study is an enlarged model of research on musical culture, in which a person at the centre of the model, through his or her concepts and behaviour, creates a work of art and receives criticism through feedback, while this process of reciprocal action dynamically affects the whole development of the culture in question. In connection with the concepts of the work of art, the approach of this study is also semiotic, as the tone symbol theory gives a particular meaning to each musical note. The purpose of this study is thus to find answers to how the tone symbol theory manifests itself in practical music making, what its origin is, whether it is well known or not, and whether shadow theatre music supports the theory. The second part of the dissertation deals with material collected through interviews and observations, as well as representative samples of musical pieces for shadow theatre and their analyses. In relation to this, a special tool for analysing gamelan music, developed for the purposes of this study, is also presented. A versatile body of material on the essence and meaning of the shadow theatre, collected from many puppet masters of an older generation, many of whom are no longer with us, constitutes an important part of this study. The study demonstrates that the tone symbol theory of Sastrapustaka stems from the tantric tradition of the Hindu-Javanese period before the 16th century and before the appearance of Islam in Java. Variants of this theory can also be found in other fields of Javanese advanced civilisation, such as architecture and dance. It seems, however, that knowledge of the tone symbolism connected specifically with the shadow theatre has been preserved only in the sultan's palace of Yogyakarta and its intimate circles. Puppet masters outside these circles certainly follow the theory, but they do not necessarily know its origin. The musical analysis makes it clear that the pieces used for the shadow theatre carry different kinds of symbolic meanings which only an initiated person can feel and understand. These meanings are closely related to the plot of the play at any given moment.
Abstract:
By the end of the 18th century, the daughters of the nobility in the northern parts of Europe received a quite different kind of education from their brothers. Although the cultural aims of the upbringing of girls were similar to those of boys, the practice of raising girls was less influenced by tradition. The education of boys was one of classical humanistic and military training, whereas girls were more freely educated. The unity and exclusiveness of the culture of the nobility were of great importance to the continued influence of this elite. The importance of education became even greater, partly because of the unstable political situation and partly because of the changes the Enlightenment had brought about in the perception of the human essence. The delicate and ambitious honnête homme was expected to strive constantly towards greater perfection as a Christian. On the other hand, the great weight given to aesthetics (etiquette and taste) made individual variation in the contents of education possible. Education consisted mainly of aesthetic studies: girls studied music, dancing, fine arts, epistolary skills and the art of polite conversation. At the same time, there was a demand for enlightenment, and one often finds personal political and social ambitions, which made competition in all skills necessary for the daughters as well. Literary sources for the education of girls include Madame LePrince de Beaumont, Madame d'Epinay, Madame de Genlis and Charles Rollin. Other, perhaps even more important, sources are the letters between parents and children and papers originating from studies. Diaries and memoirs also tell us about the practice of education in day-to-day life. The approach of this study is semiotic. It can be stated that the code of the culture was well hidden from the outsider. This was achieved, for instance, by the adoption of the foreign French language and culture. The core of the culture consisted of texts which stated the norms only through examples, expressed as good taste. Another important feature of the culture was its tendency towards theatricalisation. The way of life was dictated by taste, and moral values were included in the aesthetic norms through the constant striving for modesty. Pleasant manners were also correct from an ethical perspective. Morality could thus also be taught through etiquette.
Abstract:
Luce Irigaray is a Belgian-born philosopher, psychoanalyst and linguist. Irigaray's concept of woman is crucial for understanding her own work, but also for examining and developing the theoretical and methodological basis of feminist theory. This thesis argues that, ultimately, Irigaray's exploration of woman's being challenges our traditional notion of philosophy as a neutral discourse and the traditional notion of ourselves as philosophizing persons or human beings. However, despite its crucial role, Irigaray's idea of woman still lacks a comprehensive explication, because the discourse of sexual difference is blurred by the ideas of essentialism and biologism. Irigaray's concept of woman has been interpreted and criticized from the perspectives of metaphysical essentialism, strategic essentialism, realist essentialism and deconstructionism. This thesis argues that a reinterpretation is necessary to account for Irigaray's claims about the ‘traditional woman’, mimesis, the specificity of the feminine body, feminine expression and sexual difference. Moreover, any reading should account for the differences between women and avoid giving a prescriptive function to the essence of woman. My thesis develops a new interpretation of Irigaray's concept of woman on the basis of the phenomenology of the body. It argues that Irigaray's discourse on woman can and must be understood through the idea of existential style. Existential style is embodied, affective and spiritual, and it is constituted in relation to oneself, to others and to the world. It is temporal; it evolves and changes but preserves its open unity through its transformations. Stylistic unities, such as femininity or philosophy, are constituted in and by the singulars. This study discusses and analyses feminine existential style as a central theme and topic of Irigaray's works and shows how her work operates as a primary and paradigmatic example of the feminine style. These tasks are performed by studying the mimetic positions available to women and by explicating the phenomenological background of Irigaray's conceptions of the philosophical method and of the lived, expressive and affective body. The critical occupation and transformation of these mimetic positions, the inquiry into first-person pre-discursive experience, and the cultivation of feminine expressivity open up the possibility of becoming a woman writer, a woman lover and a woman philosopher. The appearance of these new feminine figures is a precondition for the realization of sexual difference. Irigaray thus opens up the possibility of sexual difference by instituting and constituting a feminine subject of love and wisdom, and by problematizing the idea of a neutral and absolute subject.
Abstract:
This article explains the essence of the context-sensitive parameters and dimensions in play at the time of an intervention through the application of Rog's (2012) model of contextual parameters. Rog's model offers evaluators a structured approach to examining an intervention. The initial study provided a systematic way to clarify the scope, variables, timing and appropriate evaluation methodology for evaluating the implementation of a government policy. Given that the government's implementation of the educational intervention under study followed neither an experimental research approach nor the double cycle of action research, the application of Rog's model provided an in-depth understanding of the context-sensitive environment, and it was from this clear purpose that the broader evaluation was conducted. Overall, when governments or institutions implement policy to bring about educational change, and the intervention is not guided by an appropriate evaluation approach, program evaluation is still achievable post-implementation. In this situation, Rog's (2012) model of contextual parameters is a useful way to achieve the clarity of purpose needed to guide the program evaluation.
Abstract:
Artist statement – Artisan Gallery. I have a confession to make… I don’t wear a FitBit, I don’t want an Apple Watch and I don’t like bling LEDs. But what excites me is a future where ‘wearables’ are discreet, seamless and potentially one with our body. Burgeoning e-textiles research will provide the ability to inconspicuously communicate, measure and enhance human health and well-being. Alongside this, next-generation wearables arguably will not be worn on the body, but rather within the body…under the skin. ‘Under the Skin’ is a polemic piece provoking debate on the future of wearables – a place where they are not overt, not auxiliary and perhaps not apparent. Indeed, a future where wearables are under the skin or one with our apparel. And, as underwear sits closest to the skin and is the most intimate and cloaked apparel item we wear, this work unashamedly teases out a dialogue exploring how wearables can transcend from the overt to the unseen. Context: Wearable technology, also referred to as wearable computing or ‘wearables’, is an embryonic field that has the potential to unsettle conventional notions of how technology can interact with, enhance and augment the human body. Wearable technology is the next generation of ubiquitous consumer electronics: ‘wearables’ are, in essence, miniature electronic devices that are worn by a person under clothing, embedded within clothing/textiles, on top of clothing, or as stand-alone accessories/devices. The wearables market is predicted to grow to somewhere between $30 and $50 billion over the next five years (Credit Suisse, 2013). The global wearables market, which is emergent in phase, has forecast predictions of vast consumer revenue, with the potential to become a significant cross-disciplinary disruptive space for designers and entrepreneurs. For fashion, the field of wearables is arguably at the intersection of the second and third generations of design innovation: the first phase being purely decorative, with aspects such as LED lighting; the second phase consisting of an array of wearable devices, such as smart watches, that communicate areas such as health and fitness; the third phase involving smart electronics woven into the textile to perform a vast range of functions such as body cooling, fabric colour change or garment silhouette change; and the fourth phase where wearable devices are surgically implanted under the skin to augment, transform and enhance the human body. Whilst it is acknowledged that the wearable phases are neither clear-cut nor discrete in progression, and that design innovation can still be achieved with first-generation decorative approaches, the later generations of technology that are less overt and at times ‘under the skin’ provide a uniquely rich point for design innovation where the body and technology intersect as one. With this context in mind, the wearable provocation piece ‘Under the Skin’ provides a unique opportunity for the audience to question and challenge conventional notions that wearables need to be (a) manifest in nature, (b) worn on or next to the body, and (c) purely functional.
The piece ‘Under the Skin’ is informed by advances in the marketplace for wearable innovation, such as the Australian-based wearable design firm Catapult, with their discreet textile biometric sports tracking innovation; French-based Spinali Design, with their UV app-based textile sensor that provides sunburn alerts; as well as opportunities for design technology innovation through UNICEF’s ‘Wearables for Good’ design challenge to improve the quality of life in disadvantaged communities. Exhibition: As part of Artisan’s Wearnext exhibition, the work was on public display from 25 July to 7 November 2015 and received the following media coverage:
WEARNEXT ONLINE LISTINGS AND MEDIA COVERAGE:
http://indulgemagazine.net/wear-next/
http://www.weekendnotes.com/wear-next-exhibition-gallery-artisan/
http://concreteplayground.com/brisbane/event/wear-next_/
http://www.nationalcraftinitiative.com.au/news_and_events/event/48/wear-next
http://bneart.com/whats-on/wear-next_/
http://creativelysould.tumblr.com/post/124899079611/creative-weekend-art-edition
http://www.abc.net.au/radionational/programs/breakfast/smartly-dressed-the-future-of-wearable-technology/6744374
http://couriermail.newspaperdirect.com/epaper/viewer.aspx
RADIO COVERAGE:
http://www.abc.net.au/radionational/programs/breakfast/wear-next-exhibition-whats-next-for-wearable-technology/6745986
TELEVISION COVERAGE:
http://www.abc.net.au/radionational/programs/breakfast/wear-next-exhibition-whats-next-for-wearable-technology/6745986
https://au.news.yahoo.com/video/watch/29439742/how-you-could-soon-be-wearing-smart-clothes/#page1
Abstract:
An experimental study to ascertain the ductile-to-brittle transition (DBT) in a bulk metallic glass (BMG) was conducted. Results of the impact toughness tests conducted at various temperatures on as-cast and structurally relaxed Zr-based BMG show a sharp DBT. The DBT temperature was found to be sensitive to the free-volume content in the alloy. Possible factors that result in the DBT were critically examined. It was found that the postulate of a critical free volume required for the amorphous alloy to exhibit good toughness cannot rationalize the experimental trends. Likewise, the Poisson's ratio-toughness correlations, which suggest a critical Poisson's ratio above which all glasses are tough, were found not to hold good. Viscoplasticity theories, developed using the concept of shear transformation zones and which describe the temperature and strain rate dependence of the crack-tip plasticity in BMGs, appear to be capable of capturing the essence of the experiments. Our results highlight the need for a more generalized theory to understand the origins of toughness in BMGs.