948 results for Copenhagen (Denmark). Universitet.
Abstract:
While landscape photography’s complicity with the colonial possession of new territory has been substantially discussed and is well understood, this paper considers the role of the European landscape as the focus of diasporic desire. The interdisciplinary project S2Q/Good Blood began as a social history map of Scandinavian and Nordic migration to Queensland in the nineteenth century, combining archival material from local collections with visual field-trip data gathered in Denmark, Sweden, Norway and Finland. In 2011, some of this material found its way into the installation work 'my mother is water, my father is wood'. What emerged from this experiment was an imaginary landscape, melding its loci through original photography and video footage in tandem with stock imagery and historical material. This juxtaposition reinforced the represented landscape as a narrative landscape and as evidence of the performativity of belonging. This practitioner reflection draws on Lynette Russell’s research into landscape archaeology to consider the significance of relationships with landscapes that are “not always empirically demonstrable.”
Abstract:
Peter S. Menell and Sarah M. Tran (eds), Intellectual Property, Innovation and the Environment, Cheltenham (UK) and Northampton (MA): Edward Elgar, 2014, 756 pp, hardback, ISBN 978 1 78195 160 6, http://www.e-elgar.com/bookentry_main.lasso?id=15063. There has been a longstanding deadlock over intellectual property and clean technologies in international climate talks. The United States — and other developed countries such as Japan, Denmark, Germany, the United Kingdom, Australia, and New Zealand — have pushed for stronger and longer protection of intellectual property rights related to clean technologies. The BASIC countries — Brazil, South Africa, India, and China — have pushed for greater flexibilities in respect of intellectual property for the purpose of addressing climate change and global warming. Small island states, least developed countries, and nations vulnerable to climate change have called for climate-adaptation and climate-mitigation technologies to be made available in the public domain. In the lead-up to the United Nations Climate Summit in New York on 23 September 2014, it is timely to consider the debate over intellectual property, innovation, the environment, and climate change.
Abstract:
This paper considers the ongoing litigation against the peer-to-peer network Kazaa. Record companies and Hollywood studios have faced jurisdictional and legal problems in suing this network for copyright infringement. As Wired Magazine observes: 'The servers are in Denmark. The software is in Estonia. The domain is registered Down Under, the corporation on a tiny island in the South Pacific. The users - 60 million of them - are everywhere around the world.' In frustration, copyright owners have launched copyright actions against intermediaries, such as Internet Service Providers like Verizon. They have also embarked on filing suits against individual users of file-sharing programs. In addition, copyright owners have called for domestic and international law reform in respect of digital copyright. The Senate Committee on Governmental Affairs in the United States Congress has reviewed the controversial use of subpoenas in suits against users of peer-to-peer file-sharing networks. The United States has encouraged other countries to adopt provisions of the Digital Millennium Copyright Act 1998 (US) in bilateral and regional free trade agreements.
Abstract:
In the wake of the international summits in Copenhagen and Cancún, there is an urgent need to consider the role of intellectual property law in encouraging research, development, and diffusion of clean technologies to mitigate and adapt to the effects of climate change. This book charts the patent landscapes and legal conflicts emerging in a range of fields of innovation – including renewable forms of energy, such as solar power, wind power, and geothermal energy, as well as biofuels, green chemistry, green vehicles, energy efficiency, and smart grids. In addition to reviewing key international treaties, this book provides a detailed analysis of current trends in patent policy and administration in key nation states, and offers clear recommendations for law reform. It considers such options as technology transfer, compulsory licensing, public sector licensing, and patent pools, and analyses the development of Climate Innovation Centres, the Eco-Patent Commons, and environmental prizes, such as the L-Prize, the H-Prize, and the X-Prizes. This book will have particular appeal to policy-makers, given its focus upon recent legislative developments and reform proposals, as well as to legal practitioners seeking a better understanding of recent legal, scientific, and business developments and how these affect their practice. Innovators, scientists, and researchers will also benefit from reading this book.
Abstract:
Legal Context: In the wake of the Copenhagen Accord 2009 and the Cancun Agreements 2010, a number of patent offices have introduced fast-track mechanisms to encourage patent applications in relation to clean technologies - such as those pertaining to hydrogen. However, patent offices will be under increasing pressure to ensure that the granted patents satisfy the requisite patent thresholds, as well as to identify and reject cases of fraud, hoaxes, scams, and swindles. Key Points: This article examines the BlackLight litigation in the United States, the United Kingdom, and the European Patent Office, and considers how patent offices and courts deal with patent applications in respect of clean energy and perpetual motion machines. Practical Significance: The capacity of patent offices to grant sound and reliable patents is critical to the credibility of the patent system, particularly in the context of the current focus upon promoting clean technologies.
Abstract:
Ship seakeeping operability refers to the quantification of motion performance in waves relative to mission requirements. This is used to make decisions about preferred vessel designs, but it can also be used as a comprehensive assessment of the benefits of ship-motion-control systems. Traditionally, operability computation aggregates statistics of motion computed over the envelope of likely environmental conditions in order to determine a coefficient in the range from 0 to 1 called operability. When used for the assessment of motion-control systems, the increase in operability is taken as the key performance indicator. The operability coefficient is often given the interpretation of the percentage of time operable. This paper considers an alternative probabilistic approach to this traditional computation of operability. It characterises operability not as a number to which a frequency interpretation is attached, but as a hypothesis that a vessel will attain the desired performance in one mission, considering the envelope of likely operational conditions. This enables the use of Bayesian theory to compute the probability that this hypothesis is true conditional on data from simulations. Thus, the metric considered is the probability of operability. This formulation not only adheres to recent developments in reliability and risk analysis, but also allows more accurate descriptions of ship-motion-control systems to be incorporated into the analysis, since the analysis is not limited to linear ship responses in the frequency domain. The paper also discusses an extension of the approach to the assessment of increased levels of autonomy for unmanned marine craft.
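As a rough illustration of the probabilistic idea (a minimal sketch under assumed pass/fail mission outcomes, not the paper's formulation), a Beta-Bernoulli model can turn simulated mission outcomes into a posterior probability that the operability hypothesis holds:

```python
# Minimal sketch (not the authors' formulation): a Beta-Bernoulli model for the
# probability that a vessel attains the desired performance in one mission.
# Each simulated mission over the environmental envelope yields a pass/fail
# outcome; the Beta posterior then expresses the probability of operability.
from scipy import stats

def probability_of_operability(successes, failures, alpha=1.0, beta=1.0):
    """Posterior over the operability hypothesis given simulation outcomes.

    alpha, beta: parameters of the Beta prior (1, 1 gives a uniform prior).
    Returns the posterior mean and a 95% credible interval.
    """
    posterior = stats.beta(alpha + successes, beta + failures)
    return posterior.mean(), posterior.interval(0.95)

# Hypothetical example: 87 of 100 simulated missions meet the motion criteria.
mean, (lo, hi) = probability_of_operability(successes=87, failures=13)
print(f"P(operable) ~ {mean:.3f}, 95% credible interval [{lo:.3f}, {hi:.3f}]")
```

With an uninformative prior, the posterior mean tracks the observed pass rate, while the width of the credible interval reflects how many simulated missions support the hypothesis.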
Abstract:
This paper presents a Hamiltonian model of marine vehicle dynamics in six degrees of freedom in both body-fixed and inertial momentum coordinates. The model in body-fixed coordinates presents a particular structure of the mass matrix that allows the adaptation and application of interconnection and damping assignment passivity-based control design methodologies developed for the robust stabilisation of mechanical systems in terms of generalised coordinates. As an example of application, we follow this methodology to design a passivity-based tracking controller with integral action for fully actuated vehicles in six degrees of freedom. We also describe a momentum transformation that allows an alternative model representation that resembles general port-Hamiltonian mechanical systems with a coordinate-dependent mass matrix. This can be seen as an enabling step towards adapting the theory of control of port-Hamiltonian systems, developed for robotic manipulators and multi-body mechanical systems, to the case of marine craft dynamics.
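For reference, the general port-Hamiltonian form that the momentum-transformed model resembles can be summarised as follows (a standard textbook form, not the paper's specific equations):

```latex
% General port-Hamiltonian form (standard reference form, not the paper's model).
% x: state; H: total energy; J(x) = -J(x)^T: interconnection matrix;
% R(x) = R(x)^T >= 0: dissipation matrix; g(x): input map; u: input; y: output.
\begin{aligned}
  \dot{x} &= \bigl[ J(x) - R(x) \bigr] \frac{\partial H}{\partial x}(x) + g(x)\,u, \\
  y       &= g(x)^{\top} \frac{\partial H}{\partial x}(x), \\
  H(q,p)  &= \tfrac{1}{2}\, p^{\top} M(q)^{-1} p + U(q).
\end{aligned}
```

Here q and p denote generalised coordinates and momenta, M(q) is the coordinate-dependent mass matrix mentioned in the abstract, and U(q) is the potential energy.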
Abstract:
This paper considers the dynamic modelling and motion control of a Surface Effect Ship (SES) for safer transfer of personnel and equipment between a vessel and an offshore wind turbine. Such a vessel is a key enabling factor for operation and maintenance (O&M) of offshore wind-energy infrastructure. The control system designed is referred to as the Boarding Control System (BCS). We investigate the performance of this system for a specific wind-farm service vessel, the Wave Craft. A two-modality vessel model is presented to account for the vessel's free motion and its motion whilst in contact with a wind turbine. On an SES, the pressurized air cushion carries the majority of the vessel mass. The control problem considered relates to the actuation of the cushion pressure such that wave-induced vessel motions are minimized. This enables safer personnel transfer in more developed sea states than is possible today. Results for the BCS are presented through simulation and model-scale craft testing.
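To make the pressure-actuation idea concrete, here is a deliberately simplified sketch (all parameter values hypothetical, and not the Wave Craft BCS design): heave is modelled as a mass on an air-cushion spring, and the cushion-pressure command acts as an added damping force on the heave rate.

```python
# Simplified illustration (not the paper's BCS design): damping of wave-induced
# heave on an air cushion by feeding back heave velocity to a pressure command.
# All numerical values below are hypothetical placeholders.
import numpy as np

m = 2.0e4        # vessel mass carried by the cushion [kg] (hypothetical)
k = 4.0e5        # effective cushion "air spring" stiffness [N/m] (hypothetical)
c = 1.0e4        # inherent damping [N s/m] (hypothetical)
Kd = 8.0e4       # feedback gain on heave velocity [N s/m] (hypothetical)

dt, T = 0.01, 60.0
t = np.arange(0.0, T, dt)
wave_force = 5.0e4 * np.sin(2 * np.pi * 0.5 * t)   # simple harmonic wave load

z, zdot = 0.0, 0.0
heave = []
for Fw in wave_force:
    # Pressure actuation modelled as an added damping force opposing heave rate.
    F_control = -Kd * zdot
    zddot = (Fw - c * zdot - k * z + F_control) / m
    zdot += zddot * dt
    z += zdot * dt
    heave.append(z)

print(f"peak heave with feedback: {max(np.abs(heave)):.3f} m")
```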
Abstract:
Activation of midbrain dopamine systems is thought to be critically involved in the addictive properties of abused substances. Drugs of abuse increase dopamine release in the nucleus accumbens and dorsal striatum, which are the target areas of the mesolimbic and nigrostriatal dopamine pathways, respectively. Dopamine release in the nucleus accumbens is thought to mediate the attribution of incentive salience to rewards, and dorsal striatal dopamine release is involved in habit formation. In addition, changes in the function of the prefrontal cortex (PFC), the target area of the mesocortical dopamine pathway, may skew information processing and memory formation such that the addict pays an abnormal amount of attention to drug-related cues. In this study, we wanted to explore how long-term forced oral nicotine exposure or the lack of catechol-O-methyltransferase (COMT), one of the dopamine-metabolizing enzymes, would affect the functioning of these pathways. We also wanted to find out how forced nicotine exposure or the lack of COMT would affect the consumption of nicotine, alcohol, or cocaine. First, we studied the effect of forced chronic nicotine exposure on the sensitivity of dopamine D2-like autoreceptors in microdialysis and locomotor activity experiments. We found that the sensitivity of these receptors was unchanged after forced oral nicotine exposure, although an increase in sensitivity was observed in mice treated with intermittent nicotine injections twice daily for 10 days. Thus, the effect of nicotine treatment on dopamine autoreceptor sensitivity depends on the route, frequency, and time course of drug administration. Second, we investigated whether forced oral nicotine exposure would affect the reinforcing properties of nicotine injections. The chronic nicotine exposure did not significantly affect the development of conditioned place preference to nicotine. In the intravenous self-administration paradigm, however, the nicotine-exposed animals self-administered nicotine at a lower unit dose than the control animals, indicating that their sensitivity to the reinforcing effects of nicotine was enhanced. Next, we wanted to study whether Comt gene knock-out animals would be a suitable model to study alcohol and cocaine consumption or addiction. Although previous work had shown male Comt knock-out mice to be less sensitive to the locomotor-activating effects of cocaine, the present study found that the lack of COMT did not affect the consumption of cocaine solutions or the development of cocaine-induced place preference. However, the present work did find that male Comt knock-out mice, but not female knock-out mice, consumed ethanol more avidly than their wild-type littermates. This finding suggests that COMT may be one of the factors, albeit not a primary one, contributing to the risk of alcoholism. Last, we explored the effect of COMT deficiency on dorsal striatal, accumbal, and prefrontal cortical dopamine metabolism under no-net-flux conditions and under levodopa load in freely moving mice. The lack of COMT did not affect the extracellular dopamine concentrations under baseline conditions in any of the brain areas studied. In the prefrontal cortex, dopamine levels remained high for a prolonged time after levodopa treatment in male, but not female, Comt knock-out mice. COMT deficiency induced accumulation of 3,4-dihydroxyphenylacetic acid, which increased further under levodopa load.
Homovanillic acid was not detectable in Comt knock-out animals either under baseline conditions or after levodopa treatment. Taken together, the present results show that although forced chronic oral nicotine exposure affects the reinforcing properties of self-administered nicotine, it is not an addiction model itself. COMT seems to play a minor role in dopamine metabolism and in the development of addiction under baseline conditions, indicating that dopamine function in the brain is well-protected from perturbation. However, the role of COMT becomes more important when the dopaminergic system is challenged, such as by pharmacological manipulation.
Abstract:
The surface properties of solid-state pharmaceuticals are of critical importance. Processing modifies the surfaces and affects surface roughness, which influences the performance of the final dosage form on many different levels. Surface roughness has an effect on, e.g., the properties of powders, tablet compression, and tablet coating. The overall goal of this research was to understand the surface structures of pharmaceutical surfaces. In this context, the specific purpose was to compare four different analysis techniques (optical microscopy, scanning electron microscopy, laser profilometry and atomic force microscopy) in various pharmaceutical applications where the surfaces have quite different roughness scales. This was done by comparing the image and roughness analysis techniques using powder compacts, coated tablets and crystal surfaces as model surfaces. It was found that optical microscopy was still a very efficient technique, as it yielded information that SEM and AFM imaging are not able to provide. Roughness measurements complemented the image data and gave quantitative information about height differences. AFM roughness data represent the roughness of only a small part of the surface, and therefore other methods, such as laser profilometry, are needed to provide a larger-scale description of the surface. The newly developed roughness analysis method visualised surface roughness by giving detailed roughness maps, which showed local variations in surface roughness values. The method was able to provide a picture of the surface heterogeneity and the scale of the roughness. In the coating study, the laser profilometry results showed that the increase in surface roughness was largest during the first 30 minutes of coating, when the surface was not yet fully covered with coating. The SEM images and the dispersive X-ray analysis results showed that the surface was fully covered with coating within 15 to 30 minutes. The combination of the different measurement techniques made it possible to follow the change in surface roughness and the development of the polymer coating. The optical imaging techniques gave a good overview of processes affecting the whole crystal surface, but they lacked the resolution to see small nanometre-scale processes. AFM was used to visualise the nanoscale effects of cleaving and to reveal the full surface heterogeneity underlying the optical imaging. Ethanol washing changed the small (nanoscale) structure to some extent, but the effect of ethanol washing on the larger scale was small. Water washing caused total reformation of the surface structure at all levels.
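As an illustration of what a local roughness map involves (a generic sketch, not the thesis's actual algorithm), the root-mean-square height deviation can be computed in a sliding window over a measured height map so that local variations become visible:

```python
# Generic sketch of a local roughness map (an assumption, not the thesis's
# method): compute RMS roughness in a sliding window over a height map.
import numpy as np
from scipy.ndimage import uniform_filter

def local_rms_roughness(height_map, window=32):
    """RMS deviation from the local mean height within each window."""
    mean = uniform_filter(height_map, size=window)
    mean_sq = uniform_filter(height_map**2, size=window)
    variance = np.clip(mean_sq - mean**2, 0.0, None)
    return np.sqrt(variance)

# Hypothetical usage with a synthetic 512 x 512 height map (heights in micrometres).
rng = np.random.default_rng(0)
heights = rng.normal(scale=0.2, size=(512, 512))
roughness_map = local_rms_roughness(heights, window=32)
print(f"median local RMS roughness: {np.median(roughness_map):.3f} um")
```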
Abstract:
Miniaturization of analytical instrumentation is attracting growing interest in response to the explosive demand for rapid, yet sensitive analytical methods and low-cost, highly automated instruments for pharmaceutical and bioanalyses and environmental monitoring. Microfabrication technology, in particular, has enabled fabrication of low-cost microdevices with a high degree of integrated functions, such as sample preparation, chemical reaction, separation, and detection, on a single microchip. These miniaturized total chemical analysis systems (microTAS or lab-on-a-chip) can also be arrayed for parallel analyses in order to accelerate sample throughput. Other motivations include reduced sample consumption and waste production as well as increased speed of analysis. One of the most promising hyphenated techniques in analytical chemistry is the combination of a microfluidic separation chip and a mass spectrometer (MS). In this work, emerging polymer microfabrication techniques, ultraviolet lithography in particular, were exploited to develop a capillary electrophoresis (CE) separation chip which incorporates a monolithically integrated electrospray ionization (ESI) emitter for efficient coupling with MS. The epoxy photoresist SU-8 was adopted as the structural material and characterized with respect to its physicochemical properties relevant to chip-based CE and ESI/MS, namely surface charge, surface interactions, heat transfer, and solvent compatibility. As a result, SU-8 was found to be a favorable material to substitute for the more commonly used glass and silicon in microfluidic applications. In addition, infrared (IR) thermography was introduced as a direct, non-intrusive method to examine heat transfer and thermal gradients during microchip CE. The IR data were validated through numerical modeling. The analytical performance of the SU-8-based microchips was established for qualitative and quantitative CE-ESI/MS analysis of small drug compounds, peptides, and proteins. The CE separation efficiency was found to be similar to that of commercial glass microchips and conventional CE systems. Typical analysis times were only 30-90 s per sample, indicating feasibility for high-throughput analysis. Moreover, a mass detection limit at the low-attomole level, as low as 10E+5 molecules, was achieved utilizing MS detection. The SU-8 microchips developed in this work could also be mass produced at low cost and with nearly identical performance from chip to chip. Until this work, attempts to combine CE separation with ESI in a chip-based system amenable to batch fabrication and capable of high, reproducible analytical performance had not been successful. Thus, the CE-ESI chip developed in this work is a substantial step toward lab-on-a-chip technology.
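For context on the reported figure, converting an amount of substance to a number of molecules uses Avogadro's constant, which places a low-attomole detection limit at roughly 10^5 to 10^6 molecules:

```latex
% Amount of substance to number of molecules via Avogadro's constant.
N = n\,N_{\mathrm{A}}, \qquad
1\ \mathrm{amol} = 10^{-18}\ \mathrm{mol} \times 6.022\times10^{23}\ \mathrm{mol^{-1}}
\approx 6\times10^{5}\ \text{molecules}.
```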