975 results for new cycle
Abstract:
The biphasic (pelagobenthic) life cycle is found throughout the animal kingdom, and includes gametogenesis, embryogenesis, and metamorphosis. From a tangled web of hypotheses on the origin and evolution of the metazoan pelagobenthic life cycle, current opinion appears to favor a simple, larval-like holopelagic ancestor that independently settled multiple times to incorporate a benthic phase into the life cycle. This hypothesis derives originally from Haeckel's (1874) Gastraea theory of ontogeny recapitulating phylogeny, in which the gastrula is viewed as the recapitulation of a gastraean ancestor that evolved via selection on a simple, planktonic hollow ball of cells to develop the capacity to feed. Here, we propose an equally plausible hypothesis: that the origin of the metazoan pelagobenthic life cycle was a direct consequence of sexual reproduction in a likely holobenthic ancestor. In doing so, we take into account new insights from poriferan development and from molecular phylogenies. In this scenario, the gastrula does not represent a recapitulation, but simply an embryological stage that is an outcome of sexual reproduction. The embryo can itself be considered the precursor to a biphasic lifestyle, with the embryo representing one phase and the adult another. This hypothesis is more parsimonious because it precludes the need for multiple, independent origins of the benthic form. It is then reasonable to consider that multilayered, ciliated embryos ultimately released into the water column are subject to natural selection for dispersal, longevity, and feeding, setting them on the evolutionary trajectory towards the crown metazoan planktonic larvae. These new insights from poriferan development thus clearly support the intercalation hypothesis of bilaterian larval evolution, which we now believe should be extended to discussions of the origin of biphasy in the metazoan last common ancestor.
Abstract:
Little is known of the blood parasites of coral reef fishes and nothing of how they are transmitted. We examined 497 fishes from 22 families, 47 genera, and 78 species captured at Lizard Island, Australia, between May 1997 and April 2003 for hematozoa and ectoparasites. We also investigated whether gnathiid isopods might serve as potential vectors of fish hemogregarines. Fifty-eight of 124 fishes caught in March 2002 had larval gnathiid isopods, up to 80 per host fish, and these were identified experimentally to be of 2 types, Gnathia sp. A and Gnathia sp. B. Caligid copepods were also recorded but no leeches. Hematozoa, found in 68 teleosts, were broadly hemogregarines of 4 types and an infection resembling Haemohormidium. Mixed infections (hemogregarine with Haemohormidium) were also observed, but no trypanosomes were detected in blood films. The hemogregarines were identified as Haemogregarina balistapi n. sp., Haemogregarina tetraodontis, possibly Haemogregarina bigemina, and an intraleukocytic hemogregarine of uncertain status. Laboratory-reared Gnathia sp. A larvae, fed experimentally on brushtail tangs, the latter heavily infected with the H. bigemina-like hemogregarine, contained hemogregarine gamonts and possibly young oocysts up to 3 days postfeeding, but no firm evidence that gnathiids transmit hemogregarines at Lizard Island was obtained.
Abstract:
This paper reviews the nitrogen (N) cycle of effluent-irrigated energy crop plantations, from wastewater treatment through to thermo-chemical conversion processes. In wastewater, N compounds contribute to eutrophication and toxicity in the water cycle. Removal of N via vegetative filters, specifically in short-rotation energy plantations, is a relatively new approach to managing nitrogenous effluents. Though combustion of energy crops is in principle carbon neutral, in practice their N content may contribute to NOx emissions with significant global warming potential. Intermediate pyrolysis produces advanced fuels while reducing such emissions. By operating at an intermediate temperature (500°C), it retains most N in the char as pyrrolic-N, pyridinic-N, quaternary-N, and amines. In addition, the biochar provides long-term sequestration of carbon in soils.
Abstract:
The objective of the thesis was to analyse several process configurations for the production of electricity from biomass. Process simulation models using AspenPlus, aimed at calculating the industrial performance of power plant concepts, were built, tested, and used for analysis. The criteria used in the analysis were performance and cost. All of the advanced systems appear to have higher efficiencies than the commercial reference, the Rankine cycle. However, advanced systems typically have a higher cost of electricity (COE) than the Rankine power plant: high efficiencies do not reduce fuel costs enough to compensate for the high capital costs of advanced concepts. The successful reduction of capital costs would appear to be the key to the introduction of the new systems, since capital costs account for a considerable, often dominant, part of the cost of electricity in these concepts. All of the systems have higher specific investment costs than the conventional industrial alternative, i.e. the Rankine power plant. Combined heat and power production (CHP) is currently the only industrial area of application in which bio-power costs can be reduced enough to make them competitive. Based on the results of this work, AspenPlus is an appropriate simulation platform. However, the usefulness of the models could be improved if a number of unit operations were modelled in greater detail. The dryer, gasifier, fast pyrolysis, gas engine, and gas turbine models could all be improved.
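To illustrate the cost mechanics behind this conclusion, here is a minimal levelised cost-of-electricity sketch. All numbers are hypothetical placeholders, not figures from the thesis; it only shows how a higher-efficiency concept can still produce more expensive electricity when its specific investment is higher.

```python
# Illustrative levelised cost-of-electricity (COE) comparison.
# All numbers below are hypothetical placeholders, not results from the thesis.

def coe(capital_eur_per_kw, fuel_eur_per_mwh_fuel, efficiency,
        capital_charge=0.10, o_and_m_frac=0.03, full_load_hours=7000):
    """Levelised COE in EUR per MWh of electricity."""
    annual_capital = capital_eur_per_kw * (capital_charge + o_and_m_frac)  # EUR/kW/a
    mwh_e_per_kw = full_load_hours / 1000.0                                # MWh_e/kW/a
    capital_part = annual_capital / mwh_e_per_kw                           # EUR/MWh_e
    fuel_part = fuel_eur_per_mwh_fuel / efficiency                         # EUR/MWh_e
    return capital_part + fuel_part

# Conventional Rankine plant: cheaper to build, less efficient.
print(coe(capital_eur_per_kw=1500, fuel_eur_per_mwh_fuel=20, efficiency=0.33))  # ~88.5
# Advanced gasification concept: more efficient, but higher specific investment.
print(coe(capital_eur_per_kw=2800, fuel_eur_per_mwh_fuel=20, efficiency=0.45))  # ~96.4
```

With these illustrative inputs, the fuel saving from the efficiency gain (about 16 EUR/MWh) is outweighed by the extra capital charge (about 24 EUR/MWh), which is exactly the pattern the thesis reports.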
Abstract:
The soil-plant-moisture subsystem is an important component of the hydrological cycle. Over the last 20 or so years a number of computer models of varying complexity have represented this subsystem with differing degrees of success. The aim of the present work has been to improve and extend an existing model. The new model is less site-specific, allowing for the simulation of a wide range of soil types and profiles. Several processes not included in the original model are simulated by the inclusion of new algorithms, including macropore flow, hysteresis, and plant growth. Changes have also been made to the infiltration, water uptake, and water flow algorithms. Using field data from various sources, regression equations have been derived which relate parameters in the suction-conductivity-moisture content relationships to easily measured soil properties such as particle-size distribution data. Independent tests have been performed on laboratory data produced by Hedges (1989). The parameters found by regression for the suction relationships were then used in equations describing the infiltration and macropore processes. An extensive literature review produced a new model for calculating plant growth from actual transpiration, which was itself partly determined by the root densities and leaf area indices derived by the plant growth model. The new infiltration model uses intensity/duration curves to disaggregate daily rainfall inputs into hourly amounts. The final model has been calibrated and tested against field data, and its performance compared to that of the original model. Simulations have also been carried out to investigate the effects of various parameters on infiltration, macropore flow, actual transpiration, and plant growth. Qualitative comparisons have been made between these results and data given in the literature.
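To make one component above concrete, here is a minimal sketch of the rainfall disaggregation step. The triangular 6-hour storm profile is an assumed placeholder; the model derives its profiles from observed intensity/duration curves.

```python
# Disaggregate a daily rainfall total (mm) into hourly amounts using a
# normalised intensity/duration profile. The profile below is illustrative.

def disaggregate_daily_rainfall(daily_mm, profile):
    total = sum(profile)
    return [daily_mm * w / total for w in profile]

profile = [1, 2, 3, 3, 2, 1] + [0] * 18  # storm in the first 6 hours of the day
hourly = disaggregate_daily_rainfall(24.0, profile)
print(hourly[:6])  # [2.0, 4.0, 6.0, 6.0, 4.0, 2.0]
```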
Abstract:
Back in 2003, we published ‘MAX’ randomisation, a process of non-degenerate saturation mutagenesis using exactly 20 codons (one for each amino acid) or else any required subset of those 20 codons. ‘MAX’ randomisation saturates codons located in isolated positions within a protein, as might be required in enzyme engineering, or else on one face of an alpha-helix, as in zinc finger engineering. Since that time, we have been asked for an equivalent process that can saturate multiple, contiguous codons in a non-degenerate manner. We have now developed ‘ProxiMAX’ randomisation, which does just that: generating DNA cassettes for saturation mutagenesis without degeneracy or bias. Offering an alternative to trinucleotide phosphoramidite chemistry, ProxiMAX randomisation uses nothing more sophisticated than unmodified oligonucleotides and standard molecular biology reagents. Thus it requires no specialised chemistry, reagents, or equipment, and simply relies on a process of saturation cycling comprising ligation, amplification, and digestion for each cycle. The process can encode selected amino acids either with unbiased representation or in pre-defined ratios, and each saturated position can be defined independently of the others. We demonstrate accurate saturation of up to 11 contiguous codons. As such, ProxiMAX randomisation is particularly relevant to antibody engineering.
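To make the idea concrete, here is a minimal in-silico sketch of what non-degenerate saturation means at the sequence level. The codon table and sampling below are illustrative only (the codons are arbitrary valid choices, not the published MAX codon set); ProxiMAX itself builds cassettes by wet-lab cycles of ligation, amplification, and digestion.

```python
# One codon per amino acid: saturation without degeneracy, since every
# draw encodes exactly one intended amino acid (no stop codons, no bias
# from degenerate mixes such as NNK).
import random

MAX_CODONS = {
    'A': 'GCT', 'C': 'TGC', 'D': 'GAT', 'E': 'GAA', 'F': 'TTC',
    'G': 'GGT', 'H': 'CAC', 'I': 'ATC', 'K': 'AAA', 'L': 'CTG',
    'M': 'ATG', 'N': 'AAC', 'P': 'CCG', 'Q': 'CAG', 'R': 'CGT',
    'S': 'TCT', 'T': 'ACC', 'V': 'GTT', 'W': 'TGG', 'Y': 'TAC',
}

def saturated_cassette(n_positions, allowed=MAX_CODONS, weights=None):
    """Build one randomised cassette of n contiguous saturated codons.

    Each position draws one codon per amino acid, either uniformly
    (unbiased representation) or in pre-defined ratios via `weights`.
    """
    aas = list(allowed)
    w = [weights.get(a, 0) for a in aas] if weights else None
    picks = random.choices(aas, weights=w, k=n_positions)
    return ''.join(allowed[a] for a in picks)

print(saturated_cassette(11))  # e.g. 11 contiguous saturated codons
```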
Abstract:
This research aims at assessing the environmental impact of the poultry supply chain from cradle to grave, using case study research and life cycle assessment (LCA). While a limited number of generic poultry production LCA studies have been published, fewer still assess the whole process of a specific organisation, and none compare the increased impact of further processing. Our results show that, irrespective of the impact assessment method used, the process of producing portions has a considerably higher total environmental impact, due to the extra raw material required to deliver the same mass to retail. Our research contributes to the growing number of LCA studies and could be used by practitioners for comparison against national and international averages. From a theoretical point of view, this research provides new insights into the relationship between vertically integrated supply chains and environmental performance, which has not been examined in the past.
Abstract:
This paper presents for the first time the concept of measurement assisted assembly (MAA) and outlines the research priorities for the realisation of this concept in industry. MAA denotes a paradigm shift in assembly for high-value and complex products and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how this can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes exhibit the requirement for rectification and rework, use inflexible tooling, and are largely manual, resulting in cost and cycle time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight, and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other research priorities relate to the development of integrated dimensional variation management, thermal compensation algorithms, and measurement planning and inspection algorithms linking design to measurement and process planning.
Abstract:
The purpose of this study was to assess the prevalence of bullying and victimization in a metropolitan area. A cross-sectional study with kindergarten (n = 127) and first grade (n = 126) children was conducted in two Miami-Dade County Public Schools and three private schools in the same area. Bullying and victimization behavior and social acceptance were assessed through peer nomination and the mental health outcomes of depression and anxiety were assessed through children's self-report. Teachers and parents also completed a social behavior scale for each child. Three areas of analyses were conducted pertaining to membership classification of social roles and the social acceptance and mental health outcomes associated with those roles, reporter agreement within the social roles, and the psychometric properties of the Childhood Social Behavior Scale. Results showed an overall negative pattern of adjustment for children identified as a member of any of the negative social roles. Also, the results support a new analytic approach to the investigation of social roles. The implications of these findings for early identification, social policy, and effective prevention strategies are discussed.
Abstract:
Waikiki, Hawaii, faces declining tourism numbers, sinking property values, and, possibly, entry into the decline phase of the tourism life cycle. Seeking the advice of world-renowned planners, it has set its sights on a new master plan aimed at correcting much that seems to have gone wrong.
Abstract:
This dissertation studies context-aware applications, with a focus on the algorithms proposed for the client side. The required context-aware infrastructure is discussed in depth to illustrate that such an infrastructure collects the mobile user's context information, registers service providers, derives the mobile user's current context, distributes user context among context-aware applications, and provides tailored services. The proposed approach strikes a balance between the context server and mobile devices: context acquisition is centralized at the server to ensure the reusability of context information among mobile devices, while context reasoning remains at the application level. Hence, centralized context acquisition combined with distributed context reasoning is viewed as the better overall solution. The context-aware search application is designed and implemented at the server side. A new algorithm is proposed to take user context profiles into consideration. By promoting feedback on the dynamics of the system, any prior user selection is saved for further analysis, so that it may contribute to the results of a subsequent search. On the basis of these developments at the server side, various solutions are provided at the client side. A software-based proxy component is set up for the purpose of data collection. This research endorses the belief that the proxy at the client side should contain the context reasoning component; implementing such a component supports this belief, in that context-aware applications are able to derive user context profiles. Furthermore, a context cache scheme is implemented to manage the cache on the client device in order to minimize processing requirements and other resource use (bandwidth, CPU cycles, power). Java and MySQL platforms are used to implement the proposed architecture and to test scenarios derived from a user's daily activities. To provide a practical testing environment without the heavy cost of establishing such a comprehensive infrastructure, a software simulation using the free Yahoo search API is provided as a means to evaluate the effectiveness of the design approach in a realistic way. The integration of the Yahoo search engine into the context-aware architecture demonstrates how context-aware applications can meet user demands for tailored services and products in and around the user's environment. The test results show that the overall design is highly effective, providing new features and enriching the mobile user's experience through a broad scope of potential applications.
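As a minimal sketch of what a client-side context cache of this kind might look like, here is a small least-recently-used store keyed by context type. This is an assumed design for illustration only, not the dissertation's actual Java/MySQL implementation.

```python
# A tiny LRU context cache: keeps recently used context entries on the
# device so the client spends less bandwidth, CPU, and power re-fetching
# them from the context server.
from collections import OrderedDict

class ContextCache:
    def __init__(self, capacity=32):
        self.capacity = capacity
        self._store = OrderedDict()

    def get(self, key):
        if key not in self._store:
            return None               # miss: caller fetches from the context server
        self._store.move_to_end(key)  # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        self._store[key] = value
        self._store.move_to_end(key)
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict the least recently used entry

cache = ContextCache(capacity=2)
cache.put('location', 'office')
cache.put('activity', 'meeting')
cache.get('location')             # refreshes 'location'
cache.put('calendar', 'busy')     # evicts 'activity'
```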
Abstract:
The response of natural CH4 sources to climate changes will be an important factor to consider as concentrations of this potent greenhouse gas continue to increase. Polar ice cores provide the means to assess this sensitivity in the past and have shown a close connection between CH4 levels and northern hemisphere temperature variability over the last glacial cycle. However, the contribution of the various CH4 sources and sinks to these changes is still a matter of debate. Contemporaneous stable CH4 isotope records in ice cores provide additional boundary conditions for assessing changes in the CH4 sources and sinks. Here we present new ice core CH4 isotope data covering the last 160,000 years, showing a clear decoupling between CH4 loading and carbon isotopic variations over most of the record. We suggest that δ13CH4 variations were not dominated by a change in the source mix but rather by climate- and CO2-related ecosystem control on the isotopic composition of the methane precursor material, especially in seasonally inundated wetlands in the tropics. In contrast, relatively stable δ13CH4 intervals occurred during large CH4 loading changes concurrent with past climate changes, implying that most CH4 sources (most notably tropical wetlands) responded simultaneously.
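For readers less familiar with the notation, δ13CH4 is the standard delta value of the 13C/12C ratio of methane. This is the textbook definition, not a formula specific to this study:

```latex
% Delta notation for the carbon isotopic composition of methane,
% expressed in per mil relative to the VPDB standard.
\[
  \delta^{13}\mathrm{CH_4}
  = \left(
      \frac{\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\mathrm{sample}}}
           {\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\mathrm{VPDB}}}
      - 1
    \right) \times 1000
\]
```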