Abstract:
Ecosystems can alternate suddenly between contrasting persistent states due to internal processes or external drivers. It is important to understand the mechanisms by which these shifts occur, especially in exploited ecosystems. There have been several abrupt marine ecosystem shifts attributed either to fishing, recent climate change or a combination of these two drivers. We show that temperature has been an important driver of the trophodynamics of the North Sea, a heavily fished marine ecosystem, for nearly 50 years and that a recent pronounced change in temperature established a new ecosystem dynamic regime through a series of internal mechanisms. Using an end-to-end ecosystem approach that included primary producers, primary, secondary and tertiary consumers, and detritivores, we found that temperature modified the relationships among species through nonlinearities in the ecosystem involving ecological thresholds and trophic amplifications. Trophic amplification provides an alternative mechanism to positive feedback to drive an ecosystem towards a new dynamic regime, which in this case favours jellyfish in the plankton and decapods and detritivores in the benthos. Although overfishing is often held responsible for marine ecosystem degeneration, temperature can clearly bring about similar effects. Our results are relevant to ecosystem-based fisheries management (EBFM), seen as the way forward to manage exploited marine ecosystems.
Abstract:
Exploring climate and anthropogenic impacts on marine ecosystems requires an understanding of how trophic components interact. However, integrative end-to-end ecosystem studies (experimental and/or modelling) are rare. Experimental investigations often concentrate on a particular group or individual species within a trophic level, while tropho-dynamic field studies typically employ either a bottom-up approach concentrating on the phytoplankton community or a top-down approach concentrating on the fish community. Likewise, the emphasis within modelling studies is usually placed upon phytoplankton-dominated biogeochemistry or on aspects of fisheries regulation. In consequence, the roles of zooplankton communities (protists and metazoans) linking phytoplankton and fish communities are typically under-represented, if not (especially in fisheries models) ignored. Where represented in ecosystem models, zooplankton are usually incorporated in an extremely simplistic fashion, using empirical descriptions merging various interacting physiological functions governing zooplankton growth and development, and thereby ignoring physiological feedback mechanisms. Here we demonstrate, within a modelled plankton food-web system, how trophic dynamics are sensitive to small changes in parameter values describing zooplankton vital rates, and thus the importance of using appropriate zooplankton descriptors. Through a comprehensive review, we reveal the mismatch between empirical understanding and modelling activities, identifying important issues that warrant further experimental and modelling investigation. These include: food selectivity; kinetics of prey consumption and interactions with assimilation and growth; form of voided material; and mortality rates at different age-stages relative to prior nutrient history. In particular, there is a need for dynamic data series in which predator and prey of known nutrient history are studied interacting under varied pH and temperature regimes.
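The sensitivity the abstract describes can be illustrated with a toy sketch. The model below is a minimal nutrient-phytoplankton-zooplankton (NPZ) system with made-up parameter values and a linear grazing response, not the paper's model; it simply shows how a modest change in a single zooplankton vital rate (here, the grazing coefficient) shifts the simulated standing stocks.

```python
def run_npz(grazing_rate, steps=2000, dt=0.01):
    """Minimal NPZ toy model, forward-Euler integrated; all parameter
    values are illustrative only (mass is conserved across the three pools)."""
    n, p, z = 4.0, 1.0, 0.5                       # initial stocks
    uptake, assim, p_mort, z_mort = 1.0, 0.3, 0.1, 0.2
    for _ in range(steps):
        growth = uptake * n / (1.0 + n) * p       # Monod nutrient uptake
        grazing = grazing_rate * p * z            # linear grazing response
        dn = -growth + p_mort * p + z_mort * z + (1 - assim) * grazing
        dp = growth - grazing - p_mort * p
        dz = assim * grazing - z_mort * z         # only a fraction assimilated
        n, p, z = n + dn * dt, p + dp * dt, z + dz * dt
    return p, z

# A 40% change in one zooplankton vital rate shifts the simulated stocks
p_low, z_low = run_npz(grazing_rate=0.5)
p_high, z_high = run_npz(grazing_rate=0.7)
```

The two runs diverge from the first time step, which is the abstract's point: outputs at every trophic level depend on how the zooplankton vital rates are parameterised.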
Abstract:
In September 2007, observations were made of a siphonophore in surface waters and near the seabed by sea users off south Devon and south-east Cornwall. The same siphonophore was also recorded from regular samples collected offshore of Plymouth. The species is identified as Apolemia uvaria, which had not previously been recorded off Plymouth. It was sampled until March 2008 and re-appeared, in smaller numbers, from autumn 2008 until February 2009, but was not reliably reported in autumn 2009 (to the end of October). The occurrence is unlikely to be due to sea warming; more likely it reflects some variation in oceanic currents, possibly influxes of Atlantic water.
Abstract:
This study describes the formulation, characterisation and preliminary clinical evaluation of mucoadhesive, semi-solid formulations containing hydroxyethylcellulose (HEC, 1-5%, w/w), polyvinylpyrrolidone (PVP, 2 or 3%, w/w), polycarbophil (PC, 1 or 3%, w/w) and tetracycline (5%, w/w, as the hydrochloride). Each formulation was characterised in terms of drug release, hardness, compressibility, adhesiveness (using a texture analyser in texture profile analysis mode), syringeability (using a texture analyser in compression mode) and adhesion to a mucin disc (measured as a detachment force using the texture analyser in tensile mode). The release exponent for the formulations ranged from 0.78+/-0.02 to 1.27+/-0.07, indicating that drug release was non-diffusion controlled. Increasing the concentrations of each polymeric component significantly increased the time required for 10 and 30% release of the original mass of tetracycline, due to both increased viscosity and, additionally, the unique swelling properties of the formulations. Increasing concentrations of each polymeric component also increased the hardness, compressibility, adhesiveness, syringeability and mucoadhesion of the formulations. The effects on product hardness, compressibility and syringeability may be due to increased product viscosity and, hence, increased resistance to compression. Similarly, the effects of these polymers on adhesiveness/mucoadhesion highlight their mucoadhesive nature and, importantly, the effects of polymer state (particularly PC) on these properties. Thus, in formulations where the neutralisation of PC was maximally suppressed, adhesiveness and mucoadhesion were also maximal. Interestingly, statistical interactions were primarily observed between the effects of HEC and PC on drug release, mechanical and mucoadhesive properties. These were explained by the effects of HEC on the physical state of PC, namely swollen or unswollen.
In the preliminary clinical evaluation, a formulation was selected that offered an appropriate balance of the above physical properties and contained 3% HEC, 3% PVP and 1% PC, in addition to tetracycline 5% (as the hydrochloride). The clinical efficacy of this (test) formulation was compared to an identical tetracycline-devoid (control) formulation in nine periodontal pockets (greater than or equal to 5 mm depth). One week following administration of the test formulation, there was a significant improvement in periodontal health as identified by reduced numbers of sub-gingival microbial pathogens. Therefore, it can be concluded that, when used in combination with mechanical plaque removal, the tetracycline-containing semi-solid systems described in this study would augment such therapy by enhancing the removal of pathogens, thus improving periodontal health. (C) 2000 Elsevier Science B.V. All rights reserved.
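The release exponent quoted above comes from fitting the power law M_t/M_inf = k * t^n (the Korsmeyer-Peppas model) to the early portion of a release profile; an exponent near 0.5 indicates Fickian diffusion, while values near or above 1, as reported here, indicate case-II-like, non-diffusion-controlled release. A minimal fit, shown on synthetic zero-order data rather than the paper's measurements, might look like:

```python
import math

def fit_release_exponent(times, fractions):
    """Fit M_t/M_inf = k * t^n (Korsmeyer-Peppas power law) by linear
    least squares on log-transformed data; returns (k, n)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(f) for f in fractions]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    n = sxy / sxx                       # slope of the log-log plot = exponent n
    k = math.exp(mean_y - n * mean_x)   # intercept gives the rate constant k
    return k, n

# Synthetic zero-order (case-II) release: 2% of the dose per hour
times = [1, 2, 4, 8, 16]                       # h
fractions = [0.02 * t for t in times]          # M_t / M_inf
k, n = fit_release_exponent(times, fractions)  # n is recovered as 1.0
```

In practice the fit is restricted to roughly the first 60% of release, where the power law is considered valid.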
Abstract:
The paper is primarily concerned with the modelling of aircraft manufacturing cost. The aim is to establish an integrated life cycle balanced design process through a systems engineering approach to interdisciplinary analysis and control. The cost modelling is achieved using the genetic causal approach that enforces product family categorisation and the subsequent generation of causal relationships between deterministic cost components and their design source. This utilises causal parametric cost drivers and the definition of the physical architecture from the Work Breakdown Structure (WBS) to identify product families. The paper presents applications to the overall aircraft design with a particular focus on the fuselage as a subsystem of the aircraft, including fuselage panels and localised detail, as well as engine nacelles. The higher level application to aircraft requirements and functional analysis is investigated and verified relative to life cycle design issues for the relationship between acquisition cost and Direct Operational Cost (DOC), for a range of both metal and composite subsystems. Maintenance is considered in some detail as an important contributor to DOC and life cycle cost. The lower level application to aircraft physical architecture is investigated and verified for the WBS of an engine nacelle, including a sequential build stage investigation of the materials, fabrication and assembly costs. The studies are then extended by investigating the acquisition cost of aircraft fuselages, including the recurring unit cost and the non-recurring design cost of the airframe sub-system. The systems costing methodology is facilitated by the genetic causal cost modelling technique as the latter is highly generic, interdisciplinary, flexible, multilevel and recursive in nature, and can be applied at the various analysis levels required of systems engineering.
Therefore, the main contribution of the paper is a methodology for applying systems engineering costing, supported by the genetic causal cost modelling approach, whether at a requirements, functional or physical level.
Abstract:
The future convergence of voice, video and data applications on the Internet requires that next generation technology provides bandwidth and delay guarantees. Current technology trends are moving towards scalable aggregate-based systems where applications are grouped together and guarantees are provided at the aggregate level only. This solution alone is not enough for interactive video applications with sub-second delay bounds. This paper introduces a novel packet marking scheme that controls the end-to-end delay of an individual flow as it traverses a network enabled to supply aggregate-granularity Quality of Service (QoS). IPv6 Hop-by-Hop extension header fields are used to track the packet delay encountered at each network node and autonomous decisions are made on the best queuing strategy to employ. The results of network simulations are presented and it is shown that when the proposed mechanism is employed the requested delay bound is met with a 20% reduction in resource reservation and no packet loss in the network.
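A toy version of the per-hop logic described above can be sketched as follows. The field names and the pro-rata decision rule are hypothetical illustrations; the actual header layout and queuing policy are defined in the paper.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    delay_budget_ms: float       # end-to-end delay bound requested for the flow
    accumulated_ms: float = 0.0  # delay stamped so far (carried, in the paper,
                                 # in IPv6 Hop-by-Hop extension header fields)

def queue_decision(pkt: Packet, hops_done: int, hops_total: int) -> str:
    """Escalate a packet to the fast queue once it has consumed more
    than its pro-rata share of the delay budget (toy rule)."""
    pro_rata = pkt.delay_budget_ms * hops_done / hops_total
    return "expedited" if pkt.accumulated_ms > pro_rata else "best-effort"

pkt = Packet(delay_budget_ms=100.0)
pkt.accumulated_ms = 60.0  # delay recorded over the first 2 of 5 hops
choice = queue_decision(pkt, hops_done=2, hops_total=5)  # behind schedule
```

The point of such per-packet state is that only flows actually falling behind their budget consume expedited resources, which is how the paper reports meeting delay bounds with less reservation.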
Abstract:
This paper presents a new packet scheduling scheme called agent-based WFQ to control and maintain QoS parameters in virtual private networks (VPNs) within the confines of adaptive networks. Future networks are expected to be open heterogeneous environments consisting of more than one network operator. In this adaptive environment, agents act on behalf of users or third-party operators to obtain the best service for their clients and maintain those services through the modification of the scheduling scheme in routers and switches spanning the VPN. In agent-based WFQ, an agent on the router monitors the accumulated queuing delay for each service. In order to control and to keep the end-to-end delay within the bounds, the weights for services are adjusted dynamically by agents on the routers spanning the VPN. If there is an increase or decrease in queuing delay of a service, an agent on a downstream router informs the upstream routers to adjust the weights of their queues. This keeps the end-to-end delay of services within the specified bounds and offers better QoS compared to VPNs using static WFQ. This paper also describes the algorithm for agent-based WFQ, and presents simulation results. (C) 2003 Elsevier Science Ltd. All rights reserved.
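The feedback loop described above can be sketched in a few lines. The weight-update rule here is a placeholder of this summary's own devising, not the paper's algorithm: it inflates the weight of any service whose measured queuing delay exceeds its target, then renormalises.

```python
def adjust_weights(weights, measured_ms, target_ms):
    """Raise the WFQ weight of any service whose measured queuing delay
    exceeds its target, then renormalise so the weights sum to 1 (toy rule)."""
    adjusted = {}
    for svc, w in weights.items():
        ratio = measured_ms[svc] / target_ms[svc]
        adjusted[svc] = w * max(ratio, 1.0)  # never shrink directly;
                                             # renormalisation does that
    total = sum(adjusted.values())
    return {svc: w / total for svc, w in adjusted.items()}

weights  = {"voice": 0.5, "video": 0.3, "data": 0.2}
measured = {"voice": 12.0, "video": 60.0, "data": 80.0}  # ms; video is late
target   = {"voice": 20.0, "video": 40.0, "data": 80.0}  # ms; per-service bound
new_weights = adjust_weights(weights, measured, target)  # video share grows
```

In the paper this adjustment is propagated by agents from a downstream router to the upstream routers spanning the VPN, so the correction applies along the whole path rather than at a single node.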
Abstract:
Energetic costs of fighting, such as high lactate or low glucose, have been shown in a range of species to correlate with the decisions made by each opponent, particularly the decision by one opponent, the 'loser', to end the fight by 'giving up'. Studies based on complete fights of differing duration, however, do not provide information on the changes in the physiological correlates of fighting that may take place during the course of the encounter, or how these changes may influence the capability and decisions of the contestants. We interrupted fights between hermit crabs, Pagurus bernhardus, at specific points, and related energy status to the preceding activities. Costs rose quickly with a rapid accumulation of lactic acid in attackers and declining muscular glycogen in defenders. Changes in physiological status appeared much earlier than the changes in behaviour that they may have caused. Furthermore, some physiological changes might have been an effect, rather than the cause, of fight decisions. (c) 2005 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.
Abstract:
Software product development is recognised as difficult due to the intangible nature of the product, requirements elicitation, effective progress measurement, and so forth. In this paper, we describe some of the challenges of software product development and how the challenges are being met by lean management principles and techniques. Specifically, we examine lean principles and techniques that were devised by Toyota and other manufacturers over the last 50 years. Applying lean principles to software development projects has been advocated for over ten years and it will be shown that the extensive lean literature is a valuable source of ideas for software development. A case study with a software development organisation, Timberline Inc., will demonstrate that lean principles and techniques can be successfully applied to software product development.
Abstract:
What-if simulations have been identified as one solution for business performance related decision support. Such support is especially useful in cases where it can be automatically generated out of Business Process Management (BPM) Environments from the existing business process models and performance parameters monitored from the executed business process instances. Currently, some of the available BPM Environments offer basic-level performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance related decision support at business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimisations, or a combination of such solutions into already existing BPM environments. The approach abstracts from the underlying process modelling techniques, enabling automatic decision support for processes spanning numerous BPM Environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
Abstract:
The kinetics of the acid-catalysed hydrolysis of cellobiose in the ionic liquid 1-ethyl-3-methylimidazolium chloride, [C(2)mim]Cl, was studied as a model for general lignocellulosic biomass hydrolysis in ionic liquid systems. The results show that the rate of the two competing reactions, polysaccharide hydrolysis and sugar decomposition, vary with acid strength, and that for acids with an aqueous pK(a) below approximately zero, the hydrolysis reaction is significantly faster than the degradation of glucose, thus allowing hydrolysis to be performed with a high selectivity for glucose. In tests with soluble cellulose, hemicellulose (xylan), and lignocellulosic biomass (Miscanthus grass), comparable hydrolysis rates were observed with bond scission occurring randomly along the biopolymer chains, in contrast to end-group hydrolysis observed with aqueous acids.
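The competition described above is the classic series reaction A -(k1)-> B -(k2)-> C, with cellobiose as A, glucose as the desired intermediate B, and degradation products as C. The standard analytic solution for the intermediate shows why a hydrolysis rate well above the degradation rate gives high glucose selectivity; the rate constants below are illustrative, not the paper's fitted values.

```python
import math

def intermediate_conc(k1, k2, t, a0=1.0):
    """Concentration of B in the series reaction A -(k1)-> B -(k2)-> C,
    both steps first order (standard analytic solution)."""
    if abs(k1 - k2) < 1e-12:                  # degenerate equal-rate case
        return a0 * k1 * t * math.exp(-k1 * t)
    return a0 * k1 / (k2 - k1) * (math.exp(-k1 * t) - math.exp(-k2 * t))

# Strong acid: hydrolysis (k1) much faster than glucose degradation (k2)
glucose_strong = intermediate_conc(k1=1.0, k2=0.05, t=4.0)
# Weaker acid: the two rates are comparable, so glucose degrades as it forms
glucose_weak = intermediate_conc(k1=0.2, k2=0.2, t=4.0)
```

When k1 >> k2 the glucose pool builds up nearly quantitatively before degrading; when the rates are comparable, the attainable glucose yield is capped well below the theoretical maximum.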
Abstract:
A new configurable architecture is presented that offers multiple levels of video playback by accommodating variable levels of network utilization and bandwidth. By utilizing scalable MPEG-4 encoding at the network edge and using specific video delivery protocols, media streaming components are merged to fully optimize video playback for IPv6 networks, thus improving QoS. This is achieved by introducing “programmable network functionality” (PNF) which splits layered video transmission and distributes it evenly over available bandwidth, reducing packet loss and delay caused by out-of-profile DiffServ classes. An FPGA design is presented that improves performance (e.g. link utilization and end-to-end delay) and that, during congestion, improves on-time delivery of video frames by up to 80% compared with current “static” DiffServ.
Abstract:
Semi-solid forming processes such as thermoforming and injection blow moulding are used to make much of today’s packaging. As for most packaging there is a drive to reduce product weight and improve properties such as barrier performance. Polymer nanocomposites offer the possibility of increased modulus (and hence potential product light-weighting) as well as improved barrier properties and are the subject of much research attention. In this particular study, polypropylene–clay nanocomposite sheets produced via biaxial deformation are investigated and the structure of the nanocomposites is quantitatively determined in order to gain a better understanding of the influence of the composite structure on mechanical properties. Compression moulded sheets of polypropylene and polypropylene/Cloisite 15A nanocomposite (5 wt.%) were biaxially stretched to different stretching ratios, and then the structure of the nanocomposite was examined using XRD and TEM techniques. Different stretching ratios produced different degrees of exfoliation and orientation of the clay tactoids. The sheet properties were then investigated using DSC, DMTA, and tensile tests. It was found that regardless of the degree of exfoliation or orientation, the addition of clay has no effect on percentage crystallinity or melting temperature, but it has an effect on the crystallization temperature and on the crystal size distribution. DMTA and tensile tests show that both the degree of exfoliation and the degree of orientation positively correlate with the dynamic mechanical properties and the tensile properties of the sheet.
Abstract:
Cooperative MIMO (Multiple Input–Multiple Output) allows multiple nodes to share their antennas to emulate antenna arrays and transmit or receive cooperatively. It has the ability to increase the capacity for future wireless communication systems and it is particularly suited for ad hoc networks. In this study, based on the transmission procedure of a typical cooperative MIMO system, we first analyze the capacity of single-hop cooperative MIMO systems, and then we derive the optimal resource allocation strategy to maximize the end-to-end capacity in multi-hop cooperative MIMO systems. The study shows three implications. First, only when the intra-cluster channel is better than the inter-cluster channel does cooperative MIMO result in a capacity increment. Second, for a given scenario there is an optimal number of cooperative nodes. For instance, in our study an optimal deployment of three cooperative nodes achieves a capacity increment of 2 bps/Hz when compared with direct transmission. Third, an optimal resource allocation strategy plays a significant role in maximizing end-to-end capacity in multi-hop cooperative MIMO systems. Numerical results show that when optimal resource allocation is applied we achieve more than 20% end-to-end capacity increment on average when compared with an equal resource allocation strategy.
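The bottleneck argument behind the optimal allocation can be sketched with idealised Shannon capacities. Equal per-stream SNR and hops time-sharing a single channel are simplifying assumptions of this sketch, not the paper's exact system model; under them, the well-known result is that giving each hop a time fraction inversely proportional to its capacity beats an equal split.

```python
import math

def hop_capacity(n_streams, snr_db, bandwidth_hz=1.0):
    """Idealised MIMO hop capacity: n_streams parallel spatial streams,
    each Shannon-limited at the same SNR (a simplifying assumption)."""
    snr = 10 ** (snr_db / 10)
    return n_streams * bandwidth_hz * math.log2(1 + snr)

def end_to_end_capacity(hop_caps):
    """Hops time-share one channel: a time fraction t_i proportional to
    1/C_i equalises per-hop throughputs, so the optimal end-to-end
    capacity is the harmonic combination 1 / sum(1/C_i)."""
    return 1.0 / sum(1.0 / c for c in hop_caps)

# Three hops with unequal link quality and stream counts (illustrative)
caps = [hop_capacity(2, 10.0), hop_capacity(2, 20.0), hop_capacity(1, 10.0)]
c_opt = end_to_end_capacity(caps)      # optimal time allocation
c_equal = min(caps) / len(caps)        # equal split: bottleneck / n_hops
```

Whenever the hop capacities differ, c_opt strictly exceeds c_equal, which is the intuition behind the >20% average gain the abstract reports for optimal over equal allocation.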
Abstract:
The Perils of Moviegoing in America is a film history that examines the various physical and (perceived) moral dangers facing audiences during the first fifty years of film exhibition.
Chapter 1: “Conflagration”
As early as 1897, a major fire broke out at a film exhibition in San Francisco, with flames burning the projectionist and nearby audience members. From that point until the widespread adoption of safety stock in 1950, fires were a very common movie-going experience. Hundreds of audience members lost their lives in literally thousands of theatre fires, ranging from early nickelodeons to the movie palaces of the thirties and forties.
Chapter 2: “Thieves Among Us”
Bandits robbed movie theatres on hundreds of occasions from the early days of film exhibition through the end of the Great Depression. They held up ticket booths, and they dynamited theatre safes. They also shot theatre managers, ushers, and audience members, as a great many of the robberies occurred while movies were playing on the screens inside.
Chapter 3: “Bombs Away”
Bombings at movie theatres became common in small towns and large cities on literally hundreds of occasions from 1914 to the start of World War II. Some were incendiary bombs, and some were stench bombs; both could be fatal, whether due to explosions or to the trampling of panicked moviegoers.
Chapter 4: “It’s Catching”
Widespread movie-going in the early 20th century provoked an outcry from numerous doctors and optometrists who believed that viewing films could do irreparable harm to the vision of audience members. Medical publications (including the Journal of the American Medical Association) published major studies on this perceived problem, which then filtered into popular-audience magazines and newspapers.
Chapter 5: “The Devil’s Apothecary Shops”
Sitting in the dark with complete strangers proved worrisome for many early filmgoers, who had good reason to be concerned. Darkness meant that prostitutes could easily work in the balconies of some movie theatres, as could “mashers” who molested female patrons (and sometimes children) after the lights were dimmed. That was all in addition to the various murderers who used the cover of darkness to commit their crimes at movie theatres.
Chapter 6: “Blue Sundays”
Blue laws were those regulations that prohibited businesses from operating on Sundays. Most communities across the US had such legislation on their books, which by the nickelodeon era were at odds with the thousands of filmgoers who went to the movies every Sunday. Theatre managers were often arrested, making newspaper headlines over and over again. Police sometimes even arrested entire film audiences as accomplices in the Blue Law violations.
Chapter 7: “Something for Nothing”
In an effort to bolster ticket sales, many movie theatres in the 1910s began to hold lotteries in which lucky audience members won cash prizes; by the time of the Great Depression, lotteries like “Bank Night” became a common aspect of the theatre-going enterprise. However, reception studies have generally overlooked the intense (and sometimes coordinated) efforts by police, politicians, and preachers to end this practice, which they viewed as illegal and immoral gambling.