Abstract:
This paper seeks to assimilate Queer Theory: that is, to bring it within the ambit of a ‘mainstream’ or ‘dominant’ space: the academy. It does so by historicising Queer Theory, and investigating, if not what it is, then at least what it has been. This makes it possible to engage critically with Queer Theory. Suggesting that Queer Theory has often employed tropes of assimilation, the paper turns to another cultural site at which such language is popular - science fiction - in order to investigate the workings of these metaphors. It goes on to suggest some of the assumptions about cultures which underlie these metaphors. Finally, it points to other sites in Queer Theory which undermine these assumptions, and provide other ways - quite uninterested in assimilation - in which to think Queer.
Abstract:
Histories of representation of Blackness are quite distinct in Australia and in America. Indigenous Australian identities have been consistently 'fatal', in Baudrillard's use of that term. So, while Black American representation includes intensely banal images of middle-class, materialistic individuals, such histories are largely absent in the Australian context. This implies that the few such representations which do occur — and particularly those of everyday game shows such as Sale of the Century and Family Feud — are particularly important for presenting a trivial, unexciting version of Aboriginality. This also clarifies the distinction between American and Australian versions of Blackness, and suggests that the latter set of representations might be more usefully viewed in relation to Native American rather than Black American images. The status of indigeneity might prove to be more relevant to Australian Aboriginal representation than the previously favoured identity of skin colour (Blackness).
Abstract:
In 2010 we realised that our fifth 48 hour game making competition was more than a mere event: we were actually watching our local games industry enact a very intense process of community making and reflective practice. This presentation of our early research on the event was an invited spectacle at the Games Connect Australia Pacific Conference 2010 held at the Gold Coast Convention Centre, 14-15 October 2010. Abstract: Jams, Jellybeans and the fruit of passion: if games are about creating innovative player experience, then the place to start is how we set up game design education as an experience. Authors: truna aka j.turner – Brisbane IGDA & Lubi Thomas, Mt Nebo Studios. The 48 hour game making challenge has grown since it started in 2007 to accommodate 20 teams: some of the teams are professionals, some are made up of people working in other industries, and most are made up of students from the various tertiary institutes around town. We still don’t quite understand why these mad people sign up for 48 hours of intense creativity just for the jelly beans we hand out as prizes, but we do suspect that the space we give them and the passion they bring to the event offer those of us involved in the education side of the games industry vital insights into what is critical and important in terms of education. The Australian games industry really needs these people: they are bright and ingenious, and they make the most amazing games. But you won't get them by telling tertiary institutions what you fancy this year in terms of skills and criteria; you will get them by fostering their creativity and passion. If games are about creating innovative player experience, then the place to start is how we set up game design education as an experience.
Abstract:
A large proportion of companies today are looking to design and focusing on end users as a means of driving new projects. However, many companies are still drawn to technological improvements as the driver of innovation within their industry context. The Australian livestock industry is no different. To date, the adoption of new products and services within the livestock industry has been documented as quite slow. This paper investigates why disruptive innovation should be a priority for these technologically focused companies and demonstrates how design-led innovation can bring about higher-quality engagement between end users and companies alike. A case study linking participatory design and design thinking is presented. Within this, a conceptual model for presenting future scenarios to internal and external stakeholders is applied to the livestock industry, assisting companies to align strategy, culture and technological advancement with meaningful product offerings to consumers.
Abstract:
A pervasive and puzzling feature of banks’ Value-at-Risk (VaR) is its abnormally high level, which leads to excessive regulatory capital. A possible explanation for the tendency of commercial banks to overstate their VaR is that they incompletely account for the diversification effect among broad risk categories (e.g., equity, interest rate, commodity, credit spread, and foreign exchange). By underestimating the diversification effect, banks’ proprietary VaR models produce overly prudent market risk assessments. In this paper, we examine empirically the validity of this hypothesis using actual VaR data from major US commercial banks. In contrast to the VaR diversification hypothesis, we find that US banks show no sign of systematic underestimation of the diversification effect. In particular, the diversification effects used by banks are very close to (and quite often larger than) our empirical diversification estimates. A direct implication of this finding is that individual VaRs for each broad risk category, just like aggregate VaRs, are biased risk assessments.
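As a rough illustration of how such a diversification effect can be quantified from reported figures, the following sketch compares the simple sum of category VaRs with the aggregate VaR; the function name and the numbers are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: the diversification effect implied by a bank's reported VaRs.
# All numbers below are illustrative, not data from the study.

def diversification_effect(category_vars, aggregate_var):
    """Return the absolute and percentage diversification effect.

    category_vars : iterable of VaRs reported for each broad risk category
                    (e.g. equity, interest rate, commodity, credit spread, FX).
    aggregate_var : the bank's reported firm-wide (aggregate) VaR.
    """
    undiversified = sum(category_vars)          # simple sum, as if perfectly correlated
    absolute = undiversified - aggregate_var    # amount "saved" by diversification
    percentage = absolute / undiversified       # share of the undiversified total
    return absolute, percentage

# Illustrative numbers only (millions of dollars)
abs_de, pct_de = diversification_effect([40, 25, 10, 15, 20], aggregate_var=80)
print(f"diversification effect: {abs_de} ({pct_de:.0%})")   # -> 30 (27%)
```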
Abstract:
Issues of equity and inequity have always been part of employment relations and are a fundamental part of the industrial landscape. For example, in most countries in the nineteenth century and a large part of the twentieth century women and members of ethnic groups (often a minority in the workforce) were barred from certain occupations, industries or work locations, and received less pay than the dominant male ethnic group for the same work. In recent decades attention has been focused on issues of equity between groups, predominantly women and different ethnic groups in the workforce. This has been embodied in industrial legislation, for example in equal pay for women and men, and frequently in specific equity legislation. In this way a whole new area of law and associated workplace practice has developed in many countries. Historically, employment relations and industrial relations research has not examined employment issues disaggregated by gender or ethnic group. Born out of concern with conflict and regulation at the workplace, studies tended to concentrate on white, male, unionized workers in manufacturing and heavy industry (Ackers, 2002, p. 4). The influential systems model crafted by Dunlop (1958) gave rise to ‘the discipline’s preoccupation with the “problem of order” [which] ensures the invisibility of women, not only because women have generally been less successful in mobilizing around their own needs and discontents, but more profoundly because this approach identifies the employment relationship as the ultimate source of power and conflict at work’ (Forrest, 1993, p. 410). Similarly, ‘the system approach does not deliberately exclude gender . . . by reproducing a very narrow research approach and understanding of issues of relevance for the research, gender is in general excluded or looked on as something of peripheral interest’ (Hansen, 2002, p. 198). However, long-lived patterns of gender segregation in occupations and industries, together with discriminatory access to work and social views about women and ethnic groups in the paid workforce, mean that the employment experience of women and ethnic groups is frequently quite different to that of men in the dominant ethnic group. Since the 1980s, research into women and employment has figured in the employment relations literature, but it is often relegated to a separate category in specific articles or book chapters, with women implicitly or explicitly seen as the atypical or exceptional worker (Hansen, 2002; Wajcman, 2000). The same conclusion can be reached for other groups with different labour force patterns and employment outcomes. This chapter proposes that awareness of equity issues is central to employment relations. As with industrial relations legislation and approaches, each country has a unique set of equity policies and legislation, reflecting its history and culture. Yet while most books on employment and industrial relations deal with issues of equity in a separate chapter (most commonly on equity for women or, more recently, on ‘diversity’), the reality in the workplace is that all types of legislation and policies which impact on wages and working conditions interact, and their impact cannot be disentangled one from another. When discussing equity in workplaces in the twenty-first century we are now faced with a plethora of different terms in English.
Terms used include discrimination, equity, equal opportunity, affirmative action and diversity with all its variants (workplace diversity, managing diversity, and so on). There is a lack of agreed definitions, particularly when the terms are used outside of a legislative context. This ‘shifting linguistic terrain’ (Kennedy-Dubourdieu, 2006b, p. 3) varies from country to country and changes over time even within the one country. There is frequently a division made between equity and its related concepts and the range of expressions using the term ‘diversity’ (Wilson and Iles, 1999; Thomas and Ely, 1996). These present dilemmas for practitioners and researchers due to the amount and range of ideas prevalent – and the breadth of issues that are covered when we say ‘equity and diversity in employment’. To add to these dilemmas, the literature on equity and diversity has become bifurcated: the literature on workplace diversity/managing diversity appears largely in the business literature, while that on equity in employment appears frequently in legal and industrial relations journals. Workplaces of the twenty-first century differ from those of the nineteenth and twentieth centuries not only in the way they deal with individual and group differences but also in the way they interpret what are fair and equitable outcomes for different individuals and groups. These variations are the result of a range of social conditions, legislation and workplace constraints that have influenced the development of employment equity and the management of diversity. Attempts to achieve employment equity have primarily been dealt with through legislative means, and in the last fifty years this legislation has included elements of anti-discrimination, affirmative action, and equal employment opportunity in virtually all OECD countries (Mor Barak, 2005, pp. 17–52). Established on human rights and social justice principles, this legislation is based on the premise that systemic discrimination has existed and/or continues to exist in the labour force and that particular groups of citizens have less advantageous employment outcomes. It is based on group identity, and employment equity programmes in general apply across all workplaces and are mandatory. The more recent notions of diversity in the workplace are based on ideas coming principally from the USA in the 1980s which have spread widely in the Western world since the 1990s. Broadly speaking, diversity ideas focus on individual differences either on their own or in concert with the idea of group differences. The diversity literature is based on a business case: that is, that diversity is profitable in a variety of ways for business; it generally lacks a social justice or human rights justification (Burgess et al., 2009, pp. 81–2). Managing diversity is represented at the organizational level as a voluntary and local programme. This chapter discusses some major models and theories of equity and diversity. It begins by charting the history of ideas about equity in employment and then briefly discusses what is meant by equality and equity. The chapter then analyses the major debates about the ways in which equity can be achieved. The more recent ideas about diversity are then discussed, including the history of these ideas and the principles which guide this concept. The following section discusses the major frameworks of both equity and diversity. The chapter then raises some ways in which insights from the equity and diversity literature can inform employment relations.
Finally, the future of equity and diversity ideas is discussed.
Abstract:
A bioassay technique, based on surface-enhanced Raman scattering (SERS) tagged gold nanoparticles encapsulated with a biotin-functionalised polymer, has been demonstrated through the spectroscopic detection of a streptavidin binding event. A methodical series of steps preceded these results: synthesis of nanoparticles which were found to give a reproducible SERS signal; design and synthesis of polymers with RAFT-functional end groups able to encapsulate the gold nanoparticle. The polymer also enabled the attachment of a biotin molecule functionalised so that it could be attached to the hybrid nanoparticle through a modular process. Finally, a positive bioassay for this model construct was demonstrated using streptavidin/biotin binding. The synthesis of silver and gold nanoparticles was performed using tri-sodium citrate as the reducing agent. The shape of the silver nanoparticles was quite difficult to control. Gold nanoparticles were able to be prepared in more regular (spherical) shapes and therefore gave a more consistent and reproducible SERS signal. The synthesis of gold nanoparticles with a diameter of 30 nm was the most reproducible and these were also stable over the longest periods of time. From the SERS results the optimal size of gold nanoparticles was found to be approximately 30 nm. Obtaining a consistent SERS signal with nanoparticles smaller than this was particularly difficult. Nanoparticles more than 50 nm in diameter were too large to remain suspended for longer than a day or two and formed a precipitate, rendering the solutions useless for our desired application. Gold nanoparticles dispersed in water were able to be stabilised by the addition of as-synthesised polymers dissolved in a water-miscible solvent. Polymer-stabilised AuNPs could not be formed from polymers synthesised by conventional free radical polymerization, i.e. polymers that did not possess a sulphur-containing end-group. This indicated that the sulphur-containing functionality present within the polymers was essential for the self-assembly process to occur. Polymer stabilization of the gold colloid was evidenced by a range of techniques including visible spectroscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, thermogravimetric analysis and Raman spectroscopy. After treatment of the hybrid nanoparticles with a series of SERS tags, focussing on 2-quinolinethiol, the SERS signals were found to have comparable signal intensity to the citrate-stabilised gold nanoparticles. This finding illustrates that the stabilization process does not interfere with the ability of gold nanoparticles to act as substrates for the SERS effect. Incorporation of a biotin moiety into the hybrid nanoparticles was achieved through a ‘click’ reaction between an alkyne-functionalised polymer and an azido-functionalised biotin analogue. This functionalised biotin was prepared through a 4-step synthesis from biotin. Upon exposure of the surface-bound streptavidin to biotin-functionalised polymer hybrid gold nanoparticles, then washing, a SERS signal was obtained from the 2-quinolinethiol which was attached to the gold nanoparticles (positive assay). After exposure to functionalised polymer hybrid gold nanoparticles without biotin present, then washing, a SERS signal was not obtained as the nanoparticles did not bind to the streptavidin (negative assay). These results illustrate the applicability of SERS-active functional-polymer encapsulated gold nanoparticles for bioassay applications.
Abstract:
This is a deliberately contentious paper about the future of the socio-political sphere in the West based on what we know about its past. I argue that the predominant public discourse in Western countries is best characterised as one of selective forgetfulness; a semi-blissful, amnesiacal state of collective dementia that manifests itself in symbolic idealism: informationalism. Informationalism is merely the latest form of idealism. It is a lot like religion insofar as it causally relates abstract concepts with reality and, consequently, becomes confused between the two. Historically, this has proven to be a dangerous state of affairs, especially when elites become confused between ideas about how a society should work, and the way it actually does work. Central to the idealism of the information age, at least in intellectual spheres, is the so-called "problem of the subject". I argue that the "problem of the subject" is a largely synthetic, destabilising, and ultimately fruitless theoretical abstraction which turns on a synthetically derived, generalised intradiscursive space; existentialist nihilism; and the theoretical baubles of ontological metaphysics. These philosophical aberrations are, in turn, historically concomitant with especially destructive political and social configurations. This paper sketches a theoretical framework for identity formation which rejects the problem of the subject, and proposes potential resources, sources, and strategies with which to engage the idealism that underpins this obfuscating problematic in an age of turbulent social uncertainty. Quite simply, I turn to history as the source of human identity. While informationalism, like religion, is mostly focused on utopian futures, I assert that history, not the future, holds the solutions for substantive problematics concerning individual and social identities. I argue here that history, language, thought, and identity are indissolubly entangled and so should be understood as such: they are the fundamental parts of 'identities in action'. From this perspective, the ‘problem of the subject’ becomes less a substantive intellectual problematic and more a theoretical red herring.
Abstract:
In fault detection and diagnostics, limitations arising from the sensor network architecture are one of the main challenges in evaluating a system’s health status. Usually the design of the sensor network architecture is not based solely on diagnostic purposes; other factors, such as control requirements, financial constraints, and practical limitations, are also involved. As a result, it is quite common to have one sensor (or one set of sensors) monitoring the behaviour of two or more components. This can significantly increase the complexity of diagnostic problems. In this paper a systematic approach is presented to deal with such complexities. It is shown how the problem can be formulated as a Bayesian-network-based diagnostic mechanism with latent variables. The developed approach is also applied to the problem of fault diagnosis in HVAC systems, an application area with considerable modelling and measurement constraints.
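As a rough illustration of the kind of inference such a formulation supports, here is a minimal, self-contained sketch (not the paper's model) in which two latent component-health variables share a single observed sensor; all probabilities are assumed for illustration only.

```python
# Two components C1 and C2 share one sensor S. Given an abnormal sensor reading we
# infer, by brute-force enumeration over the tiny joint distribution, which component
# is more likely to be faulty. All numbers are illustrative assumptions.

from itertools import product

p_fault = {"C1": 0.05, "C2": 0.10}             # assumed prior fault probabilities

def p_sensor_abnormal(c1_faulty, c2_faulty):
    """Noisy-OR style observation model: either fault can trip the shared sensor."""
    if not c1_faulty and not c2_faulty:
        return 0.02                             # false-alarm rate
    leak = 1.0
    if c1_faulty:
        leak *= 1 - 0.9                         # C1 fault detected with prob 0.9
    if c2_faulty:
        leak *= 1 - 0.7                         # C2 fault detected with prob 0.7
    return 1 - leak

evidence_prob = 0.0
posterior = {"C1": 0.0, "C2": 0.0}
for c1, c2 in product([False, True], repeat=2):
    prior = (p_fault["C1"] if c1 else 1 - p_fault["C1"]) * \
            (p_fault["C2"] if c2 else 1 - p_fault["C2"])
    joint = prior * p_sensor_abnormal(c1, c2)   # P(C1, C2, S = abnormal)
    evidence_prob += joint
    if c1: posterior["C1"] += joint
    if c2: posterior["C2"] += joint

for comp, mass in posterior.items():
    print(f"P({comp} faulty | sensor abnormal) = {mass / evidence_prob:.2f}")
```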
Abstract:
The practice of robotics and computer vision each involves the application of computational algorithms to data. The research community has developed a very large body of algorithms, but for a newcomer to the field this can be quite daunting. For more than 10 years the author has maintained two open-source MATLAB® Toolboxes, one for robotics and one for vision. They provide implementations of many important algorithms and allow users to work with real problems, not just trivial examples. This new book makes the fundamental algorithms of robotics, vision and control accessible to all. It weaves together theory, algorithms and examples in a narrative that covers robotics and computer vision separately and together. Using the latest versions of the Toolboxes, the author shows how complex problems can be decomposed and solved using just a few simple lines of code. The topics covered are guided by real problems observed by the author over many years as a practitioner of both robotics and computer vision. It is written in a light but informative style, is easy to read and absorb, and includes over 1000 MATLAB® and Simulink® examples and figures. The book is a real walk through the fundamentals of mobile robots, navigation, localization, arm-robot kinematics, dynamics and joint-level control, then camera models, image processing, feature extraction and multi-view geometry, and finally brings it all together with an extensive discussion of visual servo systems.
Abstract:
Recently, user tagging systems have grown in popularity on the web. The tagging process is quite simple for ordinary users, which contributes to its popularity. However, the free vocabulary lacks standardization and suffers from semantic ambiguity. It is possible to capture the semantics of user tagging and represent them in the form of an ontology, but the application of the learned ontology to recommendation making has not flourished. In this paper we discuss our approach to learning a domain ontology from user tagging information and apply the extracted tag ontology in a pilot tag recommendation experiment. The initial result shows that by using the tag ontology to re-rank the recommended tags, the accuracy of the tag recommendation can be improved.
Abstract:
Recently, user tagging systems have grown in popularity on the web. The tagging process is quite simple for ordinary users, which contributes to its popularity. However, the free vocabulary lacks standardization and suffers from semantic ambiguity. It is possible to capture the semantics of user tagging in some form of ontology, but the application of the resulting ontology to recommendation making has not flourished. In this paper we discuss our approach to learning a domain ontology from user tagging information and apply the extracted tag ontology in a pilot tag recommendation experiment. The initial result shows that by using the tag ontology to re-rank the recommended tags, the accuracy of the tag recommendation can be improved.
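To illustrate what ontology-based re-ranking of recommended tags can look like in practice, here is a hypothetical sketch; the tag names, relatedness weights and the blending scheme are assumptions for illustration, not the papers' actual method.

```python
# Hypothetical sketch of ontology-based re-ranking of recommended tags. A tag's
# baseline score from the recommender is blended with how strongly a toy tag
# ontology relates it to tags already applied to the resource.

related = {                                   # assumed ontology relatedness weights
    ("python", "programming"): 0.9,
    ("programming", "software"): 0.8,
    ("python", "snake"): 0.2,
}

def relatedness(a, b):
    """Symmetric lookup of ontology relatedness; 0 if the tags are unconnected."""
    return max(related.get((a, b), 0.0), related.get((b, a), 0.0))

def rerank(candidates, existing_tags, alpha=0.6):
    """Blend the recommender's score with ontology support from existing tags.

    candidates   : dict tag -> baseline recommendation score in [0, 1]
    existing_tags: tags already assigned by users to this resource
    alpha        : weight on the baseline score (assumed value)
    """
    scored = {}
    for tag, base in candidates.items():
        support = max((relatedness(tag, t) for t in existing_tags), default=0.0)
        scored[tag] = alpha * base + (1 - alpha) * support
    return sorted(scored, key=scored.get, reverse=True)

print(rerank({"snake": 0.7, "programming": 0.6}, existing_tags=["python"]))
# -> ['programming', 'snake']: ontology support promotes the semantically closer tag
```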
Abstract:
This ALTC Teaching Fellowship aimed to establish Guiding Principles for Library and Information Science Education 2.0. The aim was achieved by (i) identifying the current and anticipated skills and knowledge required by successful library and information science (LIS) professionals in the age of web 2.0 (and beyond), and (ii) establishing the current state of LIS education in Australia in supporting the development of librarian 2.0 and, in doing so, identifying models of best practice.
The fellowship has contributed to curriculum renewal in the LIS profession. It has helped to ensure that LIS education in Australia continues to meet the changing skills and knowledge requirements of the profession it supports. It has also provided a vehicle through which LIS professionals and LIS educators may find opportunities for greater collaboration and more open communication. This will help bridge the gap between LIS theory and practice and will foster more authentic engagement between LIS education and other parts of the LIS industry in the education of the next generation of professionals. Through this fellowship the LIS discipline has become a role model for other disciplines that will face similar issues in the coming years.
Eighty-one members of the Australian LIS profession participated in a series of focus groups exploring the current and anticipated skills and knowledge needed by the LIS professional in the web 2.0 world and beyond. Whilst each focus group tended to draw on specific themes of interest to that particular group of people, there was a great deal of common ground. Eight key themes emerged: technology, learning and education, research or evidence-based practice, communication, collaboration and team work, user focus, business savvy and personal traits.
It was acknowledged that the need for successful LIS professionals to possess transferable skills and interpersonal attributes was not new. It was noted however that the speed with which things are changing in the web 2.0 world was having a significant impact and that this faster pace is placing a new and unexpected emphasis on the transferable skills and knowledge. It was also acknowledged that all librarians need to possess these skills, knowledge and attributes and not just the one or two role models who lead the way.
The most interesting finding however was that web 2.0, library 2.0 and librarian 2.0 represented a ‘watershed’ for the LIS profession. Almost all the focus groups spoke about how they are seeing and experiencing a culture change in the profession. Librarian 2.0 requires a ‘different mindset or attitude’. The Levels of Perspective model by Daniel Kim provides one lens by which to view this finding. The focus group findings suggest that we are witnessing a re-awakening of the Australian LIS profession as it begins to move towards the higher levels of Kim’s model (i.e. mental models, vision).
Thirty-six LIS educators participated in telephone interviews aimed at exploring the current state of LIS education in supporting the development of librarian 2.0. The skills and knowledge of LIS professionals in a web 2.0 world identified and discussed by the LIS educators mirrored those highlighted in the focus group discussions with LIS professionals. Similarly, it was noted that librarian 2.0 needed a focus less on skills and knowledge and more on attitude. However, whilst LIS professionals felt that there was a paradigm shift within the profession, LIS educators did not speak with one voice on this matter, with quite a number of the educators suggesting that this might be ‘overstating it a bit’. This study provides evidence of “disparate viewpoints” (Hallam, 2007) between LIS educators and LIS professionals that can have significant implications for the future, not just of LIS professional education specifically but of the profession generally.
Inviting the LIS academics to discuss how their teaching and learning activities support the development of librarian 2.0 was a core part of the interviews conducted. The strategies used and the challenges faced by LIS educators in developing their teaching and learning approaches to support the formation of librarian 2.0 are identified and discussed. A core part of the fellowship was the identification of best practice examples on how LIS educators were developing librarian 2.0. Twelve best practice examples were identified. Each educator was recorded discussing his or her approach to teaching and learning. Videos of these interviews are available via the Fellowship blog at
Abstract:
Background: This study aimed to determine whether subjective dimensions of recovery such as empowerment are associated with self-report of more objective indicators such as level of participation in the community and income from employment. A secondary aim was to investigate the extent to which diagnosis or other consumer characteristics mediated any relationship between these variables. Methods: The Community Integration Measure, the Empowerment Scale, the Recovery Assessment Scale, and the Camberwell Assessment of Needs Short Appraisal Schedule were administered to a convenience sample of 161 consumers with severe mental illness. Results: The majority of participants had a primary diagnosis of schizophreniform, anxiety/depression or bipolar affective disorder. The Empowerment Scale was quite strongly correlated with the Recovery Assessment Scale and the Community Integration Measure. Participants with a diagnosis of bipolar affective disorder had significantly higher recovery and empowerment scores than participants with schizophrenia or depression. Both empowerment and recovery scores were significantly higher for people engaged in paid employment than for those receiving social security benefits. Conclusions: The measurement of subjective dimensions of recovery such as empowerment has validity in evaluation of global recovery for people with severe mental illness. A diagnosis of bipolar disorder is associated with higher scores on subjective and objective indicators of recovery.
Abstract:
The Queensland University of Technology (QUT) allows the presentation of a thesis for the Degree of Doctor of Philosophy in the format of published or submitted papers, where such papers have been published, accepted or submitted during the period of candidature. This thesis is composed of seven published/submitted papers and one poster presentation, of which five have been published and the other two are under review. This project is financially supported by the QUTPRA Grant. The twenty-first century started with the resurrection of lignocellulosic biomass as a potential substitute for petrochemicals. Petrochemicals, which enjoyed sustained growth during the past century, have begun to reach or have reached their peak. The world energy situation is complicated by political uncertainty and by the environmental impact associated with petrochemical import and usage. In particular, greenhouse gases and toxic emissions produced by petrochemicals have been implicated as a significant cause of climate change. Lignocellulosic biomass (e.g. sugarcane biomass and bagasse), which potentially enjoys a more abundant, widely distributed, and cost-effective resource base, can play an indispensable role in the paradigm transition from a fossil-based to a carbohydrate-based economy. Poly(3-hydroxybutyrate) (PHB) has attracted much commercial interest as a plastic and biodegradable material because some of its physical properties are similar to those of polypropylene (PP), even though the two polymers have quite different chemical structures. PHB exhibits a high degree of crystallinity, has a high melting point of approximately 180°C and, most importantly, unlike PP, is rapidly biodegradable. Two major factors which currently inhibit the widespread use of PHB are its high cost and poor mechanical properties. The production costs of PHB are significantly higher than for plastics produced from petrochemical resources (e.g. PP costs US$1 per kg, whereas PHB costs US$8 per kg), and its stiff and brittle nature makes processing difficult and impedes its ability to handle high impact. Lignin, together with cellulose and hemicellulose, is one of the three main components of all lignocellulosic biomass. It is a natural polymer occurring in the plant cell wall and, after cellulose, is the most abundant polymer in nature. It is extracted mainly as a by-product in the pulp and paper industry. Although lignin is traditionally burnt in industry for energy, it has many value-adding properties. Lignin, which to date has been largely unexploited, is an amorphous polymer with hydrophobic behaviour. These properties make it a good candidate for blending with PHB; technically, blending can be a viable route to price reduction and enhanced production properties. Theoretically, lignin and PHB affect each other's physicochemical properties when they become miscible in a composite. A comprehensive study of the structural, thermal, rheological and environmental properties of lignin/PHB blends, together with those of neat lignin and PHB, is the scope of this thesis. An introduction to this research, including a description of the research problem, a literature review and an account of the research progress linking the research papers, is presented in Chapter 1. In this research, lignin was obtained from bagasse through extraction with sodium hydroxide. A novel two-step pH precipitation procedure was used to recover soda lignin with a purity of 96.3 wt% from the black liquor (i.e. the spent sodium hydroxide solution).
The precipitation process is presented in Chapter 2. A sequential solvent extraction process was used to fractionate the soda lignin into three fractions. These fractions, together with the soda lignin, were characterised to determine elemental composition, purity, carbohydrate content, molecular weight, and functional group content. The thermal properties of the lignins were also determined. The results are presented and discussed in Chapter 2. On the basis of the type and quantity of functional groups, attempts were made to identify potential applications for each of the individual lignins. As an addendum to the general section on the development of lignin composite materials, which includes Chapters 1 and 2, studies on the kinetics of bagasse thermal degradation are presented in Appendix 1. The work showed that distinct stages of mass loss depend on residual sucrose. As the development of value-added products from lignin will improve the economics of cellulosic ethanol, a review of lignin applications, including lignin/PHB composites, is presented in Appendix 2. Chapters 3, 4 and 5 are dedicated to investigations of the properties of soda lignin/PHB composites. Chapter 3 reports on the thermal stability and miscibility of the blends. Although the addition of soda lignin shifts the onset of PHB decomposition to lower temperatures, the lignin/PHB blends are thermally more stable over a wider temperature range. The results from the thermal study also indicated that blends containing up to 40 wt% soda lignin were miscible. The Tg data for these blends fitted nicely to the Gordon-Taylor and Kwei models. Fourier transform infrared spectroscopy (FT-IR) evaluation showed that the miscibility of the blends was due to specific hydrogen bonding (and similar interactions) between the reactive phenolic hydroxyl groups of lignin and the carbonyl group of PHB. The thermophysical and rheological properties of soda lignin/PHB blends are presented in Chapter 4. In this chapter, the kinetics of thermal degradation of the blends is studied using thermogravimetric analysis (TGA). This preliminary investigation is limited to the processing temperature of blend manufacturing. Of significance in the study is the drop in the apparent activation energy, Ea, from 112 kJ mol-1 for pure PHB to half that value for the blends. This means that the addition of lignin to PHB reduces the thermal stability of PHB, and that the comparatively reduced weight loss observed in the TGA data is associated with the slower rate of lignin degradation in the composite. The Tg of PHB, as well as its melting temperature, melting enthalpy and crystallinity, decreases with increasing lignin content. Results from the rheological investigation showed that at low lignin content (≤30 wt%), lignin acts as a plasticiser for PHB, while at high lignin content it acts as a filler. Chapter 5 is dedicated to the environmental study of soda lignin/PHB blends. The biodegradability of lignin/PHB blends is compared to that of PHB using the standard soil burial test. To obtain acceptable biodegradation data, samples were buried for 12 months under controlled conditions. Gravimetric analysis, TGA, optical microscopy, scanning electron microscopy (SEM), differential scanning calorimetry (DSC), FT-IR, and X-ray photoelectron spectroscopy (XPS) were used in the study.
The results clearly demonstrated that lignin retards the biodegradation of PHB, and that the miscible blends were more resistant to degradation than the immiscible blends. To obtain an understanding of the relationship between the structure of lignin and the properties of the blends, a methanol-soluble lignin, which contains three times fewer phenolic hydroxyl groups than the parent soda lignin used in preparing the blends reported in Chapters 3 and 4, was blended with PHB and the properties of the blends investigated. The results are reported in Chapter 6. At up to 40 wt% methanol-soluble lignin, the experimental data fitted the Gordon-Taylor and Kwei models, similar to the results obtained for the soda lignin-based blends. However, the values obtained for the interaction parameters of the methanol-soluble lignin blends were slightly lower than those of the soda lignin blends, indicating a weaker association between methanol-soluble lignin and PHB. FT-IR data confirmed that hydrogen bonding is the main interactive force between the reactive functional groups of lignin and the carbonyl group of PHB. In summary, the structural differences between the two lignins did not manifest themselves in the properties of their blends.
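For reference, the Gordon-Taylor and Kwei relations mentioned above take the following standard forms, where w1 and w2 are the weight fractions of the two components, Tg1 and Tg2 their glass transition temperatures, and k and q are fitted parameters (the fitted values are those reported in the relevant chapters and are not reproduced here):

```latex
% Standard single-Tg mixing rules used to assess blend miscibility.
% Gordon-Taylor:
\[
  T_g \;=\; \frac{w_1 T_{g1} + k\, w_2 T_{g2}}{w_1 + k\, w_2}
\]
% Kwei (adds an interaction term, q w_1 w_2, commonly attributed to hydrogen bonding):
\[
  T_g \;=\; \frac{w_1 T_{g1} + k\, w_2 T_{g2}}{w_1 + k\, w_2} \;+\; q\, w_1 w_2
\]
```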