847 results for Cleaning and dyeing industry
Abstract:
The last two decades have witnessed a fragmentation of previously integrated systems of production and service delivery with the advent of boundary-less, networked and porous organisational forms. This trend has been associated with the growth of outsourcing and increased use of contingent workers. One consequence of these changes is the development of production/service delivery systems based on complex national and international networks of multi-tiered subcontracting, increasingly labelled supply chains. A growing body of research indicates that subcontracting and contingent work arrangements affect design and decision-making processes in ways that can seriously undermine occupational health and safety (OHS). Elaborate supply chains also present a regulatory challenge because legal responsibility for OHS is diffused amongst a wider array of parties, targeting key decision-makers is more difficult, and government agencies encounter greater logistical difficulties trying to safeguard contingent workers. In a number of industries these problems have prompted new forms of regulatory intervention, including mechanisms for sheeting legal responsibility home to the top of supply chains, contractual tracking devices, and increasing industry, union and community involvement in enforcement. After describing these problems, the paper examines recent efforts to regulate supply chains to safeguard OHS in the United Kingdom and Australia.
Abstract:
Housing affordability and sustainable development are not polarised ideologies as both are necessary with increasing urbanisation. We must bridge the gap between current median house pricing and target affordable house pricing whilst pursuing sustainability. This paper examines the potential of initial construction cost and ongoing utilities and transport cost reduction through the integration of sustainable housing design and transit oriented development principles in a Commuter Energy and Building Utilities System (CEBUS). It also introduces current research on the development of a Dynamic Simulation Model for CEBUS applications in the Australian property development and construction industry.
Abstract:
The upstream oil and gas industry has been contending with massive data sets and monolithic files for many years, but “Big Data” is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value that is being realized by Big Data technologies in other parts of the marketplace, however, much of the data collected within the oil and gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This viewpoint examines existing data management practices in the upstream oil and gas industry, and compares them to practices and philosophies that have emerged in organizations that are leading the way in Big Data. The comparison shows that, in companies that are widely considered to be leaders in Big Data analytics, data is regarded as a valuable asset—but this is usually not true within the oil and gas industry insofar as data is frequently regarded there as descriptive information about a physical asset rather than something that is valuable in and of itself. The paper then discusses how the industry could potentially extract more value from data, and concludes with a series of policy-related questions to this end.
Abstract:
There is an ongoing debate about the reasons for and factors contributing to healthcare-associated infection (HAI). Different solutions have been proposed over time to control the spread of HAI, with more focus on hand hygiene than on other aspects such as preventing the aerial dissemination of bacteria. Yet, it emerges that there is a need for a more pluralistic approach to infection control; one that reflects the complexity of the systems associated with HAI and involves multidisciplinary teams including hospital doctors, infection control nurses, microbiologists, architects, and engineers with expertise in building design and facilities management. This study reviews the knowledge base on the role that environmental contamination plays in the transmission of HAI, with the aim of raising awareness regarding infection control issues that are frequently overlooked. From the discussion presented in the study, it is clear that many unknowns persist regarding aerial dissemination of bacteria, and its control via cleaning and disinfection of the clinical environment. There is a paucity of good-quality epidemiological data, making it difficult for healthcare authorities to develop evidence-based policies. Consequently, there is a strong need for carefully designed studies to determine the impact of environmental contamination on the spread of HAI.
Abstract:
A quantitative understanding of outdoor air quality in school environments is crucial given that air pollution levels inside classrooms are significantly influenced by outdoor pollution sources. To date, only a handful of studies have been conducted on this important topic in developing countries. The aim of this study was to quantify pollutant levels in the outdoor environment of a school in Bhutan and assess the factors driving them. Measurements were conducted for 16 weeks, spanning the wet and dry seasons, in a rural school in Bhutan. PM10, PM2.5, particle number (PN) and CO were measured daily using real-time instruments, while weekly samples for volatile organic compounds (VOCs), carbonyls and NO2 were collected using a passive sampling method. Overall mean PM10 and PM2.5 concentrations (µg/m³) were 27 and 13 for the wet season, and 36 and 29 for the dry season, respectively. Only wet season data were available for PN concentrations, with a mean of 2.56 × 10³ particles/cm³. Mean CO concentrations were below the detection limit of the instrumentation for the entire measurement period. Only low levels of eight VOCs were detected in both the wet and dry seasons, with different seasonal patterns in the concentrations of the individual compounds. The notable carbonyls were formaldehyde and hexaldehyde, with mean concentrations (µg/m³) of 2.37 and 2.41 for the wet season, and 6.22 and 0.34 for the dry season, respectively. Mean NO2 concentration for the dry season was 1.7 µg/m³, while it was below the detection limit of the instrumentation for the wet season. The pollutant concentrations were associated with a number of factors, such as cleaning and combustion activities in and around the school. A comparison with other school studies showed comparable results with a few of the studies, but in general, we found lower pollutant concentrations in the present study.
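For illustration only, the following minimal Python sketch shows how seasonal mean concentrations of the kind reported above (PM10 and PM2.5 for wet versus dry season) could be summarised; the data frame layout, column names and values are assumptions, not the study's data.

    import pandas as pd

    # Hypothetical daily outdoor measurements (µg/m³); the real study used
    # 16 weeks of real-time instrument data.
    records = pd.DataFrame({
        "season": ["wet", "wet", "wet", "dry", "dry", "dry"],
        "pm10":   [26.0, 28.5, 26.5, 35.0, 37.5, 35.5],
        "pm25":   [12.5, 13.5, 13.0, 28.0, 30.0, 29.0],
    })

    # Seasonal means, mirroring the wet/dry summary statistics quoted above.
    seasonal_means = records.groupby("season")[["pm10", "pm25"]].mean().round(1)
    print(seasonal_means)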
Abstract:
Purpose: While the global education debate remains focused on graduate skills and employability, the absence of a shared language between student, academic and industry stakeholder groups means that defining industry skills requirements is both essential and difficult. The aim of this study was to assess graduate skills requirements in a knowledge intensive industry from a demand perspective as distinct from a curriculum (supply) viewpoint. Design/methodology/approach: Skills items were derived from a breadth of disciplines across academic, policy and industry literature. CEOs and senior managers in the innovation and commercialisation industry were surveyed regarding perceptions of skills in graduates and skills in demand by the firm. Two rounds of exploratory factor analyses were undertaken to examine employers’ perceptions of the skills gap. Findings: First order analysis resolved 10 broad constructs that represent cognitive, interpersonal and intrapersonal skills domains as applied in this industry. Knowledge, leadership and interprofessional collaboration feature as prominent skills. Second order analysis revealed employers’ perceptions of graduate skills specifically centre on organisational fit and organisational success. An over-arching theme relates to performance of the individual in organisations. Research limitations/implications: Our findings suggest that the discourse on employability and the design of curriculum need to shift from instilling lists of skills towards enabling graduates to perform in a diversity of workplace contexts and expectations centred on organisational purpose. Originality/value: In contrast to the heterogeneous nature of industry surveys, we targeted a homogenous sector that is representative of knowledge intensive industries. This study contributes to the broader stakeholder dialogue of the value and application of graduate skills in this and other industry sectors.
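As a loose illustration of the two-round exploratory factor analysis described above, the Python sketch below runs a first-order factor analysis on simulated skills ratings and a second-order analysis on the resulting factor scores; the item counts, factor counts and data are assumptions, not the survey instrument or its results.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n_respondents, n_items = 120, 40
    ratings = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)

    # First-order analysis: reduce the skills items to broad constructs.
    first_order = FactorAnalysis(n_components=10, random_state=0)
    construct_scores = first_order.fit_transform(ratings)   # respondents x constructs
    item_loadings = first_order.components_                 # constructs x items

    # Second-order analysis: look for higher-level themes among the constructs
    # (analogous to the organisational fit / organisational success themes above).
    second_order = FactorAnalysis(n_components=2, random_state=0)
    theme_scores = second_order.fit_transform(construct_scores)
    print(item_loadings.shape, theme_scores.shape)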
Abstract:
2,4,6-Trinitrotoluene (TNT) is one of the most commonly used nitroaromatic explosives in the landmine, military and mining industries. This article demonstrates rapid and selective identification of TNT by surface-enhanced Raman spectroscopy (SERS) using 6-aminohexanethiol (AHT) as a new recognition molecule. First, Meisenheimer complex formation between AHT and TNT is confirmed by the development of a pink colour and the appearance of a new band around 500 nm in the UV-visible spectrum. A solution Raman spectroscopy study also supported AHT:TNT complex formation by demonstrating changes in the vibrational stretching of the AHT molecule between 2800 and 3000 cm⁻¹. For surface-enhanced Raman spectroscopy analysis, a self-assembled monolayer (SAM) of AHT is formed over the gold nanostructure (AuNS) SERS substrate in order to selectively capture TNT onto the surface. Electrochemical desorption and X-ray photoelectron studies are performed on the AHT SAM-modified surface to examine the presence of free amine groups with an appropriate orientation for complex formation. Further, an AHT and butanethiol (BT) mixed monolayer system is explored to improve the AHT:TNT complex formation efficiency. Using a 9:1 AHT:BT mixed monolayer, a very low limit of detection (LOD) of 100 fM TNT was realized. The new method delivers high selectivity towards TNT over 2,4-DNT and picric acid. Finally, real sample analysis is demonstrated by the extraction and SERS detection of 302 pM of TNT from spiked samples.
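The sketch below illustrates one common way a detection limit can be estimated from a SERS calibration (3 × blank standard deviation divided by the calibration slope); it is not the paper's procedure, and the concentrations, intensities and blank readings are invented for illustration.

    import numpy as np

    conc = np.array([1e-13, 1e-12, 1e-11, 1e-10])      # TNT concentration, M
    signal = np.array([12.0, 35.0, 260.0, 2400.0])     # SERS peak intensity, a.u.
    blank = np.array([3.1, 2.8, 3.3, 2.9, 3.0])        # repeated blank readings, a.u.

    # A simple linear calibration is used here for brevity; SERS calibrations
    # spanning several orders of magnitude are often fitted on log-log axes.
    slope, intercept = np.polyfit(conc, signal, 1)
    lod = 3 * blank.std(ddof=1) / slope
    print(f"Estimated LOD: {lod:.2e} M")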
Abstract:
Erythropoietin (EPO), a glycoprotein hormone of ∼34 kDa, is an important hematopoietic growth factor, produced mainly in the kidney, that controls the number of red blood cells circulating in the blood stream. Sensitive and rapid recombinant human EPO (rHuEPO) detection tools that improve on the current laborious EPO detection techniques are in high demand in both the clinical and sports industries. A sensitive aptamer-functionalized biosensor (aptasensor) has been developed by controlled growth of gold nanostructures (AuNS) over a gold substrate (pAu/AuNS). The aptasensor selectively binds to rHuEPO and was therefore used to extract and detect the drug from horse plasma by surface-enhanced Raman spectroscopy (SERS). Due to the nanogap separation between the nanostructures and the high population and distribution of hot spots on the pAu/AuNS substrate surface, strong signal enhancement was obtained. By using a wide area illumination (WAI) setting for the Raman detection, a low RSD of 4.92% over 150 SERS measurements was achieved. The significant reproducibility of the new biosensor addresses the serious problem of SERS signal inconsistency that hampers the use of the technique in the field. The WAI setting is compatible with handheld Raman devices. Therefore, the new aptasensor can be used for the selective extraction of rHuEPO from biological fluids, which can subsequently be screened with a handheld Raman spectrometer for SERS-based in-field protein detection.
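As a quick illustration of the reproducibility figure quoted above, the sketch below computes a relative standard deviation (RSD) over repeated SERS peak intensities; the simulated intensities are stand-ins, not the reported measurements.

    import numpy as np

    rng = np.random.default_rng(1)
    # Simulated peak intensities for 150 repeat SERS measurements (a.u.).
    peak_intensities = rng.normal(loc=1000.0, scale=49.2, size=150)

    rsd_percent = 100 * peak_intensities.std(ddof=1) / peak_intensities.mean()
    print(f"RSD over {peak_intensities.size} measurements: {rsd_percent:.2f}%")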
Abstract:
In the United States, there has been fierce debate over state, federal and international efforts to engage in genetically modified food labelling (GM food labelling). A grassroots coalition of consumers, environmentalists, organic farmers, and the food movement has pushed for law reform in respect of GM food labelling. The Just Label It campaign has encouraged United States consumers to send comments to the United States Food and Drug Administration to label genetically modified foods. This Chapter explores the various justifications made in respect of genetically modified food labelling. There has been a considerable effort to portray the issue of GM food labelling as one of consumer rights as part of ‘the right to know’. There has been a significant battle amongst farmers over GM food labelling – with organic farmers and biotechnology companies fighting for precedence. There has also been a significant discussion about the use of GM food labelling as a form of environmental legislation. The prescriptions in GM food labelling regulations may serve to promote eco-labelling and deter greenwashing. There has been a significant debate over whether GM food labelling may serve to regulate corporations – particularly from the food, agriculture, and biotechnology industries. There are significant issues about the interaction between intellectual property laws – particularly in respect of trade mark law and consumer protection – and regulatory proposals focused upon biotechnology. There has been a lack of international harmonization in respect of GM food labelling. As such, there has been a major use of comparative arguments about regulatory models in respect of food labelling. There has also been a discussion about international law, particularly with the emergence of sweeping regional trade proposals, such as the Trans-Pacific Partnership and the Trans-Atlantic Trade and Investment Partnership. This Chapter considers the United States debates over genetically modified food labelling – at state, federal, and international levels. The battles often involved the use of citizen-initiated referenda. The policy conflicts have been policy-centric disputes – pitting organic farmers, consumers, and environmentalists against the food industry and biotechnology industry. Such battles have raised questions about consumer rights, public health, freedom of speech, and corporate rights. The disputes highlighted larger issues about lobbying, fund-raising, and political influence. The role of money in United States politics has been a prominent concern of Lawrence Lessig in his recent academic and policy work with the group, Rootstrikers. Part 1 considers the debate in California over Proposition 37. Part 2 explores other key state initiatives in respect of GM food labelling. Part 3 examines the Federal debate in the United States over GM food labelling. Part 4 explores whether regional trade agreements – such as the Trans-Pacific Partnership (TPP) and the Trans-Atlantic Trade and Investment Partnership (TTIP) – will impact upon GM food labelling.
Abstract:
Industry clockspeed has been used in earlier literature to assess the rate of change of industries, but this measure remains limited in its application in longitudinal analyses as well as in systemic industry contexts. Nevertheless, there is a growing need for such a measure as business ecosystems replace standalone products and organisations are required to manage their innovation process in increasingly systemic contexts. In this paper, we first derive a temporal measure of technological industry clockspeed, which evaluates the time between successively higher levels of performance in the industry's product technology. We then derive a systemic technological industry clockspeed for systemic industry contexts, which measures the time required for a particular sub-industry to utilise the level of technological performance that is provisioned by another, interdependent sub-industry. In turn, we illustrate the use of these measures in an empirical study of the systemic personal computer industry. The results of our empirical illustration show that the proposed clockspeeds together provide informative measures of the pace of change for sub-industries and the systemic industry. We subsequently discuss the organisational considerations and theoretical implications of the proposed measures.
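A minimal sketch of the temporal clockspeed idea follows, assuming a simple series of product generations with release dates and a performance index; the dates, values and the definition of a performance "step" are illustrative assumptions, not the paper's operationalisation.

    from datetime import date

    # (release date, performance index) for successive product generations.
    generations = [
        (date(2008, 1, 1), 100),
        (date(2009, 7, 1), 150),
        (date(2010, 6, 1), 220),
        (date(2012, 3, 1), 400),
    ]

    # Temporal clockspeed here = years elapsed between successive increases
    # in the technology's performance level.
    intervals_years = [
        (later[0] - earlier[0]).days / 365.25
        for earlier, later in zip(generations, generations[1:])
        if later[1] > earlier[1]
    ]
    mean_clockspeed = sum(intervals_years) / len(intervals_years)
    print(f"Mean technological clockspeed: {mean_clockspeed:.2f} years per performance step")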
Abstract:
In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media—the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Stewart Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: ‘You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy-strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user-created Linux. We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy.’ The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books, which have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered applications’—considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.
Abstract:
Supply chain outsourcing has posed problems for conventional labour regulation, which focuses on employers contracting directly with workers, particularly employees. These difficulties have been exacerbated by the traditional trifurcated approach to regulation of pay and conditions, work health and safety and workers’ compensation. This paper analyses the parallel interaction of two legal developments within the Australian textile, clothing and footwear industry. The first is mandatory contractual tracking mechanisms within state and federal labour laws and the second is the duties imposed by the harmonised Work Health and Safety Acts. Their combined effect has created an innovative, fully enforceable and integrated regulatory framework for the textile, clothing and footwear industry and, it is argued, other supply chains in different industry contexts. This paper highlights how regulatory solutions can address adverse issues for workers at the bottom of contractual networks, such as fissured workplaces and capital fragmentation, by enabling regulators to harness the commercial power of business controllers at the apex to ensure compliance throughout the entire chain.
Abstract:
Supermarkets in Australia may have substantial market power as buyers in wholesale markets for grocery products. They may also have substantial bargaining power in negotiating contracts with their suppliers of grocery products. The Competition and Consumer Act 2010 (Cth) (CCA) regulates misconduct by supermarkets as customer/acquirers in three ways. First, s 46(1) of the CCA prohibits the ‘taking advantage’ of buyer power for the purpose of damaging a competitor, preventing entry or deterring or preventing competitive conduct. Secondly, s 21 of the Australian Consumer Law (ACL) prohibits unconscionable conduct in business-to-business transactions. Thirdly, Pt IVB of the CCA provides for the promulgation of mandatory and voluntary industry codes of conduct. Since 1 July 2015 the conduct of supermarkets as customer/acquirers has been regulated by the Food and Grocery Industry Code of Conduct. This article examines these three different approaches. It considers them against the background of the misconduct at issue in ACCC v Coles Supermarkets Australia Pty Ltd, which the ACCC chose to litigate as an unconscionable conduct case rather than a misuse of market power case. The article also considers the strengths and weaknesses of each of the three approaches and concludes that, while the three approaches address different problems, there is scope for overlap and all three should be retained for complete coverage.
Abstract:
This paper outlines the expectations of a wide range of stakeholders for environmental assurance in the pastoral industries and agriculture generally. Stakeholders consulted were domestic consumers, rangeland graziers, members of environmental groups, companies within meat and wool supply chains, and agricultural industry, environmental and consumer groups. Most stakeholders were in favour of the application of environmental assurance to agriculture, although supply chains and consumers had less enthusiasm for this than environmental and consumer groups. General public good benefits were more important to environmental and consumer groups, while private benefits were more important to consumers and supply chains. The 'ideal' form of environmental assurance appears to be a management system that provides for continuous improvement in environmental, quality and food safety outcomes, combined with elements of ISO 14024 eco-labelling such as life-cycle assessment, environmental performance criteria, third-party certification, labelling and multi-stakeholder involvement. However, market failure prevents this from being implemented and will continue to do so for the foreseeable future. In the short term, members of supply chains (the people that must implement and fund environmental assurance) want this to be kept simple and low cost, to be built into their existing industry standards and to add value to their businesses. As a starting point, several agricultural industry organisations favour the use of a basic management system, combining continuous improvement, risk assessment and industry best management practice programs, which can be built on over time to meet regulator, market and community expectations.
Abstract:
The aim of this project was to quantify differences between treated and untreated coir (coconut industrial residues) products and to identify differences in growth, yield and quality of cut flowers grown in different coir products. The work was prompted largely by the concern that some coir products, washed in low-quality (saline) water, may have detrimental effects on plant productivity and quality. There is concern in the flower production industry and among media suppliers that lower quality products are favoured on price alone, which, as this project shows, is a false economy. Specifically, the project examined:
• differences in physical and chemical properties of treated and untreated coir, along with another growing medium commonly used in the flower industry;
• potential improvements in yield and quality of Gerbera (Gerbera jamesonii);
• potential differences in vase life of Gerbera as a result of the different growing media; and
• cost-benefit implications of treated (more expensive) coir substrate products versus untreated (less expensive) coir, including any subsequent differences in yield and quality.
By first examining the physical and some chemical properties of different coir substrates and other industry standard media, the researchers were able to validate the concerns raised about potential quality issues in coir-based growing media. There was a great deal of variation in both the electrical conductivity and sodium contents. Physical properties were also variable, as expected, since manufacturers are able to target the specific physical preferences of plants through manipulation of the particle size distribution. A field trial was conducted under protected cropping practices in which three growing media were compared in terms of total productivity and also flower quality parameters such as stem length, flower diameter and vase life. The trial was a completely randomised design with the three growing media comprising treated coir discs, untreated coir discs and a pine bark coir mix. Four cultivars of Gerbera were assessed: Balance®, Carambole®, Dune® and Picobello®, all new products from Florist de Kwakel B.V., Denmark. Initial expansion from tissue culture was conducted at the Highsun Express Facility, Ormiston, Queensland. The trial included 12 replications of each cultivar in each medium (a total of 144 plants) to ensure that all data collected, and the conclusions derived from them, were statistically rigorous. The coir supplied with no pre-treatment or buffering produced significantly fewer flowers than those grown in the pine bark coir mix or the pre-treated coir. Interestingly, the pine bark coir mix produced a greater number of flowers; however, the flowers produced in the pine bark coir mix generally had shorter stems. Productivity data, combined with flower quality data and component costs, were analysed through a cost/benefit economic model, which showed that the greater revenue from better stem length outweighed the stem numbers, giving a cost-benefit ratio of 2.58 for treated coir, 2.49 for untreated coir and 2.52 for the pine bark coir mix. While this does not seem a large difference, when one considers that a producer can maintain upwards of 50,000 plants, the difference in revenue would be, at a minimum, $60,000 in this example. In conclusion, this project has found that there are significant differences in plant health, growth, yield and quality between plants grown in treated and untreated coir.
The outcome is that growers can confidently invest in more expensive treated products with the assurance that the benefits will outweigh the initial cost. It is a false economy to favour untreated coir products on price alone. Producers should ensure they fully understand the production processes when purchasing growing media. Rather than targeting lower priced materials, it is recommended that quality be the highest priority in making this management decision. In making recommendations for future research and development it was important to consider conclusions from other researchers as well as those of the current project. It has been suggested that coir media have greater longevity, which, although not captured in this study, could also lead to further cost efficiencies. Assessment of the products over a longer time period, and using a wider range of plant species, are the major recommendations for further research to ensure a greater understanding of the importance of choosing the right growing medium to meet specific needs.
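For a sense of how the cost/benefit figures above translate into revenue differences, the Python sketch below scales the reported ratios across a 50,000-plant operation; the assumed per-plant cost is illustrative, not a figure from the project.

    # Cost/benefit ratios as reported in the abstract above.
    cost_benefit_ratios = {
        "treated coir": 2.58,
        "untreated coir": 2.49,
        "pine bark coir mix": 2.52,
    }

    plants = 50_000
    cost_per_plant = 13.50   # assumed production cost in dollars, for illustration only

    # Revenue implied by each ratio, and the gap between the best and worst media.
    revenue = {media: ratio * cost_per_plant * plants
               for media, ratio in cost_benefit_ratios.items()}
    gap = max(revenue.values()) - min(revenue.values())
    print(f"Revenue difference between best and worst media: ${gap:,.0f}")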