971 results for Cash Investments Are Required For Restaurant Purchases


Relevance:

100.00%

Publisher:

Abstract:

Background: Hepatorenal tyrosinaemia (Tyr 1) is a rare inborn error of tyrosine metabolism. Without treatment, patients are at high risk of developing acute liver failure, renal dysfunction and, in the long run, hepatocellular carcinoma. The aim of our study was to collect cross-sectional data. Methods: Via questionnaires we collected retrospective data on 168 patients with Tyr 1 from 21 centres (Europe, Turkey and Israel) covering diagnosis, treatment, monitoring and outcome. In a subsequent consensus workshop, we discussed the data and their clinical implications. Results: Early treatment with NTBC accompanied by diet is essential to prevent serious complications such as liver failure, hepatocellular carcinoma and renal disease. As patients may remain initially asymptomatic or develop uncharacteristic clinical symptoms in the first months of life, newborn mass screening using succinylacetone (SA) as a screening parameter in dried blood is mandatory for early diagnosis. NTBC treatment has to be combined with natural protein restriction supplemented with essential amino acids. NTBC dosage should be reduced to the minimal dose allowing metabolic control; once-daily dosing may be an option in older children and adults in order to increase compliance. Metabolic control is judged by SA (below the detection limit) in dried blood or urine, plasma tyrosine (<400 μM) and NTBC levels in the therapeutic range (20-40 μM). Side effects of NTBC are mild and often transient. Indications for liver transplantation are hepatocellular carcinoma or failure to respond to NTBC. Follow-up procedures should include liver and kidney function tests, tumor markers and imaging, ophthalmological examination, blood count, psychomotor and intelligence testing, as well as therapeutic monitoring (SA, tyrosine and NTBC in blood). Conclusion: Based on the data from 21 centres treating 168 patients, we were able to characterize current practice and clinical experience in Tyr 1. This information could form the basis for clinical practice recommendations; however, further prospective data are required to underpin some of the recommendations.

Relevance:

100.00%

Publisher:

Abstract:

Realizing scalable performance on high performance computing systems is not straightforward for single-phenomenon codes (such as computational fluid dynamics [CFD]). This task is magnified considerably when the target software involves the interactions of a range of phenomena that have distinctive solution procedures involving different discretization methods. The problems of addressing the key issues of retaining data integrity and the ordering of the calculation procedures are significant. A strategy for parallelizing this multiphysics family of codes is described for software exploiting finite-volume discretization methods on unstructured meshes using iterative solution procedures. A mesh partitioning-based SPMD approach is used. However, since different variables use distinct discretization schemes, this means that distinct partitions are required; techniques for addressing this issue are described using the mesh-partitioning tool, JOSTLE. In this contribution, the strategy is tested for a variety of test cases under a wide range of conditions (e.g., problem size, number of processors, asynchronous / synchronous communications, etc.) using a variety of strategies for mapping the mesh partition onto the processor topology.
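
As a rough illustration of the mesh-partitioning SPMD strategy described above, the sketch below derives per-partition ownership and halo (overlap) lists from element adjacency. It is not the authors' code: the round-robin assignment merely stands in for a graph partitioner such as JOSTLE, and a real solver would exchange the halo data with MPI.

```python
# Minimal sketch of a mesh-partitioning SPMD step: assign elements of an
# unstructured mesh to "processors" and work out the halo (overlap) elements
# each partition must receive from its neighbours before an iterative sweep.

from collections import defaultdict

def partition_round_robin(n_elements, n_parts):
    """Toy stand-in for a graph partitioner: element e -> partition e % n_parts."""
    return {e: e % n_parts for e in range(n_elements)}

def build_halos(adjacency, part_of):
    """For each partition, list the foreign elements adjacent to its own elements."""
    halos = defaultdict(set)
    for elem, neighbours in adjacency.items():
        for nb in neighbours:
            if part_of[nb] != part_of[elem]:
                halos[part_of[elem]].add(nb)   # nb must be received as halo data
    return halos

if __name__ == "__main__":
    # 6-element toy "mesh": element -> adjacent elements
    adjacency = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
    part_of = partition_round_robin(len(adjacency), n_parts=2)
    halos = build_halos(adjacency, part_of)
    for p in sorted(halos):
        print(f"partition {p}: owns {[e for e, q in part_of.items() if q == p]}, "
              f"halo {sorted(halos[p])}")
```

Because different variables may use distinct discretization schemes, the same construction would be repeated per partition set, which is the issue the paper addresses.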

Relevance:

100.00%

Publisher:

Abstract:

Aerodynamic generation of sound is governed by the Navier–Stokes equations while acoustic propagation in a non-uniform medium is effectively described by the linearised Euler equations. Different numerical schemes are required for the efficient solution of these two sets of equations, and therefore, coupling techniques become an essential issue. Two types of one-way coupling between the flow solver and the acoustic solver are discussed: (a) for aerodynamic sound generated at solid surfaces, and (b) in the free stream. Test results indicate how the coupling achieves the necessary accuracy so that Computational Fluid Dynamics codes can be used in aeroacoustic simulations.
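
The one-way coupling can be pictured as a loop in which the flow solver advances, an acoustic source is extracted and interpolated onto the acoustic grid, and the acoustic solver advances with that source. The sketch below illustrates only this data flow; both "solvers" and the source definition are placeholders, not the schemes used in the paper.

```python
# Hedged sketch of one-way CFD-to-acoustics coupling: the flow field is advanced,
# an acoustic source is extracted, interpolated onto a finer acoustic grid, and
# the acoustic field is advanced with that source. Physics is deliberately trivial.

import numpy as np

def advance_flow(p_flow, t):
    """Placeholder CFD step: returns fluctuating surface pressure on the flow grid."""
    x = np.linspace(0.0, 1.0, p_flow.size)
    return np.sin(2 * np.pi * (x - 0.3 * t))        # stand-in for a Navier-Stokes solve

def extract_source(p_flow, dt, p_prev):
    """Acoustic source from the time derivative of surface pressure (assumption)."""
    return (p_flow - p_prev) / dt

def advance_acoustics(p_ac, source_on_ac_grid, dt):
    """Placeholder linearised-Euler step: simple relaxation toward the forced field."""
    return p_ac + dt * (source_on_ac_grid - 0.1 * p_ac)

flow_grid = np.linspace(0.0, 1.0, 50)
ac_grid = np.linspace(0.0, 1.0, 200)                # acoustic grid is finer
p_flow_prev = np.zeros_like(flow_grid)
p_ac = np.zeros_like(ac_grid)
dt, t = 1e-3, 0.0

for step in range(100):
    t += dt
    p_flow = advance_flow(p_flow_prev, t)
    src = extract_source(p_flow, dt, p_flow_prev)
    src_ac = np.interp(ac_grid, flow_grid, src)     # one-way transfer: flow -> acoustics
    p_ac = advance_acoustics(p_ac, src_ac, dt)
    p_flow_prev = p_flow

print("max acoustic pressure after coupling loop:", float(np.abs(p_ac).max()))
```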

Relevance:

100.00%

Publisher:

Abstract:

The first phase in the design, development and implementation of a comprehensive computational model of a copper stockpile leach process is presented. The model accounts for transport phenomena through the stockpile, reaction kinetics for the important mineral species, oxygen and bacterial effects on the leach reactions, plus heat, energy and acid balances for the overall leach process. The paper describes the formulation of the leach process model and its implementation in PHYSICA+, a computational fluid dynamics (CFD) software environment. The model draws on a number of phenomena to represent the competing physical and chemical features active in the process. The phenomena are essentially represented by a three-phase (solid-liquid-gas) multi-component transport system; novel algorithms and procedures are required to solve the model equations, including a methodology for dealing with multiple chemical species with different reaction rates in ore represented by multiple particle size fractions. Some initial validation results and application simulations are shown to illustrate the potential of the model.
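
One of the algorithmic issues mentioned above, tracking multiple chemical species with different reaction rates across several particle size fractions, can be illustrated with a toy explicit update. The rate law, rate constants and size fractions below are invented for illustration and are not those of the PHYSICA+ model.

```python
# Illustrative sketch of per-(species, size fraction) leach kinetics: each pair
# carries its own conversion, and a first-order surface-reaction rate scaled by
# inverse particle radius is integrated explicitly in time.

species_rate = {"chalcocite": 5.0e-6, "covellite": 1.0e-6}             # 1/s, hypothetical
size_fractions = {"fine": 1.0e-3, "medium": 5.0e-3, "coarse": 2.0e-2}  # radius in m

# conversion X in [0, 1] for every (species, size fraction) pair
X = {(sp, sz): 0.0 for sp in species_rate for sz in size_fractions}

dt, t_end = 3600.0, 30 * 24 * 3600.0   # 1 h steps over 30 days
t = 0.0
while t < t_end:
    for (sp, sz), x in X.items():
        k = species_rate[sp] * (1.0e-3 / size_fractions[sz])  # slower for coarse particles
        X[(sp, sz)] = min(1.0, x + dt * k * (1.0 - x))        # first-order approach to 1
    t += dt

for key, x in sorted(X.items()):
    print(f"{key[0]:>10s} / {key[1]:>6s}: conversion = {x:.3f}")
```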

Relevance:

100.00%

Publisher:

Abstract:

Problems in the preservation of the quality of granular material products are complex and arise from a series of sources during transport and storage. Whether designing a new plant or, more likely, analysing problems that give rise to product quality degradation in existing operations, the process engineer needs practical measurement and simulation tools and technologies. These technologies are required to help both in identifying the source of such problems and in designing them out. As part of a major research programme on quality in particulate manufacturing, computational models have been developed for segregation in silos, degradation in pneumatic conveyors, and the development of caking during storage, which use, where possible, micro-mechanical relationships to characterize the behaviour of granular materials. The objective of the work presented here is to demonstrate the use of these unit-process models in the analysis of large-scale granular materials handling operations. This paper presents a set of simulations of a complete large-scale granular materials handling operation, involving the discharge of the material from a silo, its transport through a dilute-phase pneumatic conveyor, and its storage in a big bag under varying environmental temperature and humidity conditions. Conclusions are drawn on the capability of the computational models to represent key granular processes, including particle size segregation, degradation, and moisture-migration caking.
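
The chaining of unit-process models can be sketched as a pipeline in which a particle size distribution is passed from silo discharge through conveying degradation to a storage caking indicator. The three "models" below are deliberately trivial stand-ins for the programme's computational models.

```python
# Conceptual sketch of chaining unit-process models for a granular handling line:
# a particle size distribution leaves the silo, is degraded in the pneumatic
# conveyor (fines generated by breakage), and the resulting fines content feeds a
# simple caking indicator for storage under a given humidity.

def silo_discharge(psd):
    """Return the discharged PSD; segregation effects omitted in this sketch."""
    return dict(psd)

def pneumatic_conveyor(psd, breakage=0.15):
    """Shift a fraction of coarse mass into fines to mimic degradation."""
    out = dict(psd)
    broken = out["coarse"] * breakage
    out["coarse"] -= broken
    out["fines"] += broken
    return out

def caking_index(psd, relative_humidity):
    """Toy indicator: caking risk grows with fines content and humidity."""
    return psd["fines"] * relative_humidity

psd = {"coarse": 0.8, "fines": 0.2}          # mass fractions entering the line
psd = silo_discharge(psd)
psd = pneumatic_conveyor(psd)
risk = caking_index(psd, relative_humidity=0.7)
print(f"PSD into storage: {psd}, caking indicator: {risk:.3f}")
```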

Relevance:

100.00%

Publisher:

Abstract:

There are increasing demands on the power density and efficiency of DC-DC power converters due to the soaring functionality and operational longevity required for today's electronic products. In addition, DC-DC converters are required to operate at new elevated frequencies in the MHz frequency regime. Typical ferrite cores, whose useable flux density falls drastically at these frequencies, have to be replaced and a method of producing compact component windings developed. In this study, two types of microinductors, pot-core and solenoid, for DC-DC converter applications have been analyzed for their performance in the MHz frequency range. The inductors were manufactured using an adapted UV-LIGA process and included electrodeposited nickel-iron and the commercial alloy Vitrovac 6025 as core materials. Using a vibrating sample magnetometer (VSM) and a Hewlett Packard 4192A LF impedance analyzer, the inductor characteristics such as power density, efficiency, inductance and Q-factor were recorded. Experimental, finite element and analytical results were used to assess the suitability of the magnetic materials and component geometries for low MHz operation.
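
For a series R-L representation of an inductor, the inductance and Q-factor follow directly from a complex impedance reading: L = Im(Z)/(2πf) and Q = Im(Z)/Re(Z). The short example below applies these relations to made-up impedance values of the kind an analyzer such as the HP 4192A would return.

```python
# Worked example: extract inductance and Q-factor from impedance readings,
# assuming a series R-L model. The impedance values are invented for illustration.

import math

readings = [            # (frequency in Hz, complex impedance in ohms) - hypothetical
    (1.0e6, complex(0.8, 6.3)),
    (2.0e6, complex(1.1, 12.4)),
    (5.0e6, complex(2.0, 30.1)),
]

for f, Z in readings:
    L = Z.imag / (2 * math.pi * f)          # series inductance
    Q = Z.imag / Z.real                     # quality factor of the series R-L model
    print(f"f = {f/1e6:.1f} MHz: L = {L*1e9:.1f} nH, Q = {Q:.1f}")
```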

Relevance:

100.00%

Publisher:

Abstract:

Vacuum Arc Remelting (VAR) is the accepted method for producing the homogeneous, fine microstructures, free of inclusions, that are required for rotating-grade applications. However, as ingot sizes increase, INCONEL 718 becomes increasingly susceptible to defects such as freckles, tree rings and white spots, particularly in large-diameter billets. Predictive models of these defects are therefore required to allow optimization of process parameters. In this paper, a multiscale and multi-physics model is presented to predict the development of microstructures in the VAR ingot during solidification. At the microscale, a combined stochastic nucleation approach and finite-difference solution of the solute diffusion is applied in the semi-solid zone of the VAR ingot. The micromodel is coupled with a solution of the macroscale heat transfer, fluid flow and electromagnetism in the VAR process through the temperature, pressure and fluid flow fields. The main objective of this study is to achieve a better understanding of the formation of defects in VAR by quantifying the influence of VAR processing parameters on grain nucleation and dendrite growth. In particular, the effect of different ingot growth velocities on microstructure formation was investigated. It was found that reducing the velocity produces significantly coarser grains.
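
A much-reduced sketch of the micromodel idea, stochastic nucleation combined with an explicit finite-difference solute diffusion step on a small 1-D slice of the semi-solid zone, is given below. The nucleation law, diffusivity and undercooling profile are assumptions for illustration only, not the authors' model.

```python
# Toy version of a stochastic-nucleation / finite-difference micromodel: cells in
# a 1-D slice nucleate with a probability that grows with undercooling, while
# solute diffuses between cells by an explicit finite-difference step.

import random
random.seed(0)

n_cells, dx, dt, D = 50, 1.0e-4, 1.0e-3, 3.0e-9         # grid and solute diffusivity (assumed)
conc = [0.05] * n_cells                                  # solute concentration
solid = [False] * n_cells
undercooling = [2.0 + 0.1 * i for i in range(n_cells)]   # hypothetical profile, K

def nucleation_probability(dT, a=1.0e-3):
    """Toy stochastic nucleation law: probability per step grows with undercooling."""
    return min(1.0, a * dT * dT)

for step in range(1000):
    # explicit finite-difference diffusion (zero-flux ends)
    lam = D * dt / dx**2
    new = conc[:]
    for i in range(1, n_cells - 1):
        new[i] = conc[i] + lam * (conc[i - 1] - 2 * conc[i] + conc[i + 1])
    conc = new
    # stochastic nucleation in still-liquid cells
    for i in range(n_cells):
        if not solid[i] and random.random() < nucleation_probability(undercooling[i]):
            solid[i] = True
            conc[i] *= 0.9      # crude placeholder adjustment on solidification

print("nucleated cells:", sum(solid), "of", n_cells)
```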

Relevance:

100.00%

Publisher:

Abstract:

Prompted by a literature review, this paper presents a framework of dimensions and indicators highlighting the underpinning aspects and values of social learning within teacher groups. Notions of social networks, communities of practice and learning teams were taken as the main perspectives informing this social learning framework. The review exercise resulted in four dimensions: (1) practice, (2) domain and value creation, (3) collective identity and (4) organization. The indicators corresponding to these dimensions serve as the foundation for understanding social learning in practice. The framework of dimensions and indicators can assist researchers as well as teacher groups that aim to assess their views on social learning and analyse whether these views fit the learning goals of the group or whether adjustments are required. In this way, learning processes within groups of teachers can be improved.

Relevance:

100.00%

Publisher:

Abstract:

Meiofaunal organisms are mobile multicellular animals that are smaller than macrofauna and larger than microfauna. The size boundaries of meiofauna are generally based on the standardised mesh apertures of sieves, with 500 μm (or 1000 μm) as the upper and 63 μm (or 42 μm) as the lower limit. Meiofauna are ubiquitous, inhabiting most marine substrata, often in high densities. Meiofauna are highly diverse, and several phyla are only known to occur as meiofauna. Owing to their small size and high densities, specialised techniques are required to collect, preserve and examine meiofauna. These are described, along with approaches to determining the biomass of these small animals. Their small size also makes them useful candidates for manipulative experiments, and the culturing of individual species and approaches to experiments on whole communities are briefly discussed.

Relevance:

100.00%

Publisher:

Abstract:

Local-scale planning decisions are required by the existing Environmental Impact Assessment process to take account of the implications of a development on a range of environmental and social factors, and could therefore be supported by an ecosystem services approach. However, empirical assessments at a local scale within the marine environment have focused on only a single or limited set of services. This paper tests the applicability of the ecosystem services approach to environmental impact appraisal by considering how the identification and quantification of a comprehensive suite of benefits provided at a local scale might proceed in practice. A methodology for conducting an Environmental Benefits Assessment (EBA) is proposed, the underlying framework for which follows the recent literature by placing the emphasis on ecosystem benefits, as opposed to services. The EBA methodology also proposes metrics that can be quantified at local scale, and is tested using a case study of a hypothetical tidal barrage development in the Taw Torridge estuary in North Devon, UK. By suggesting some practical steps for assessing environmental benefits, this study aims to stimulate discussion and so advance the development of methods for implementing ecosystem service approaches at a local scale.

Relevance:

100.00%

Publisher:

Abstract:

Frequent locations of thermal fronts in UK shelf seas were identified using an archive of 30,000 satellite images acquired between 1999 and 2008, and applied as a proxy for pelagic diversity in the designation of Marine Protected Areas (MPAs). Networks of MPAs are required for conservation of critical marine habitats within Europe, and there are similar initiatives worldwide. Many pelagic biodiversity hotspots are related to fronts, for example aggregations of cetaceans and basking sharks around the Isle of Man, Hebrides and Cornwall, and hence remote sensing can address this policy need in regions with insufficient species distribution data. This is the first study of UK Continental Shelf front locations to use a 10-year archive of full-resolution (1.1 km) AVHRR data, revealing new aspects of their spatial and seasonal variability. Frontal locations determined at sea or predicted by ocean models agreed closely with the new frequent front maps, which also identified many additional frontal zones. These front maps were among the most widely used datasets in the recommendation of UK MPAs, and would be applicable to other geographic regions and to other policy drivers such as facilitating the deployment of offshore renewable energy devices with minimal environmental impact.

Relevance:

100.00%

Publisher:

Abstract:

Eutrophication is a process resulting from an increase in anthropogenic nutrient inputs from rivers and other sources, the consequences of which can include enhanced algal biomass, changes in plankton community composition and oxygen depletion near the seabed. Within the context of the Marine Strategy Framework Directive, indicators (and associated thresholds) have been identified to assess the eutrophication status of an ecosystem. Large databases of in situ observations are required to properly assess eutrophication status. Marine hydrodynamic/ecosystem models provide continuous fields of a wide range of ecosystem characteristics. Using such models in this context could help to overcome the lack of in situ data and provide a powerful tool for ecosystem-based management and policy makers. Here we demonstrate a methodology that uses a combination of model outputs and in situ data to assess the risk of eutrophication in the coastal domain of the North Sea. The risk of eutrophication is computed for the past and present as well as for different future scenarios. This allows us to assess both the current risk and its sensitivity to anthropogenic pressure and climate change. Model sensitivity studies suggest that the coastal waters of the North Sea may be more sensitive to anthropogenic river loads than to climate change in the near future (to 2040).
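
The core of such an assessment is a comparison of indicator values, taken from model output where in situ data are sparse, against agreed thresholds. The sketch below shows this indicator-versus-threshold step with hypothetical indicator names and threshold values; they are not those used in the Marine Strategy Framework Directive assessment.

```python
# Hedged sketch of an indicator-versus-threshold eutrophication risk check.
# All names and numbers are placeholders for illustration.

thresholds = {
    "winter_DIN_umol_l": 15.0,      # nutrient enrichment
    "chl_p90_ug_l": 10.0,           # elevated algal biomass
    "min_near_bed_O2_mg_l": 6.0,    # oxygen depletion (values below this are a breach)
}

def eutrophication_risk(indicators):
    """Flag an area as 'at risk' if any indicator breaches its threshold."""
    breaches = []
    if indicators["winter_DIN_umol_l"] > thresholds["winter_DIN_umol_l"]:
        breaches.append("nutrients")
    if indicators["chl_p90_ug_l"] > thresholds["chl_p90_ug_l"]:
        breaches.append("chlorophyll")
    if indicators["min_near_bed_O2_mg_l"] < thresholds["min_near_bed_O2_mg_l"]:
        breaches.append("oxygen")
    return ("at risk" if breaches else "not at risk"), breaches

# indicator values could come from model output where observations are sparse
area = {"winter_DIN_umol_l": 22.0, "chl_p90_ug_l": 12.5, "min_near_bed_O2_mg_l": 7.1}
print(eutrophication_risk(area))
```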

Relevance:

100.00%

Publisher:

Abstract:

The aim of this paper is to investigate the role of phytoplankton nutritional status in the formation of the spring bloom regularly observed at station L4 in the Western English Channel. Using a modelling approach, we tested the hypothesis that the increase in light from winter to spring induces a decrease in diatom nutritional status (i.e., an increase in the C:N and C:P ratios), thereby reducing their palatability and allowing them to bloom. To this end, a formulation describing the Stoichiometric Modulation of Predation (SMP) has been implemented in a simplified version of the European Regional Seas Ecosystem Model (ERSEM). The model was coupled with the General Ocean Turbulence Model (GOTM), implemented at station L4 and run for 10 years (2000-2009). Simulated carbon-to-nutrient ratios in diatoms were analysed in relation to microzooplankton biomass, grazing and assimilation efficiency. The model reproduced the observed temporal evolution of the in situ data and showed the importance of microzooplankton grazing in controlling the early onset of the bloom. Simulation results supported our hypothesis and provided a conceptual model explaining the formation of the diatom spring bloom in the investigated area. However, additional data describing the microzooplankton grazing impact and the variation of carbon-to-nutrient ratios inside phytoplankton cells are required to further validate the proposed mechanisms.
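
The SMP idea can be illustrated by scaling a standard grazing term with a palatability factor that decreases as the prey C:N ratio rises above the Redfield value. The functional form and parameters below are illustrative choices, not the formulation implemented in ERSEM.

```python
# Sketch of a Stoichiometric Modulation of Predation (SMP)-style term: grazing
# pressure on diatoms is scaled down as their C:N ratio rises above Redfield,
# i.e. nutrient-starved, carbon-rich cells become less palatable.

REDFIELD_CN = 106.0 / 16.0     # molar C:N reference

def smp_factor(cn_ratio, steepness=0.5):
    """Palatability multiplier in (0, 1]: 1 at Redfield, decreasing for higher C:N."""
    excess = max(0.0, cn_ratio - REDFIELD_CN)
    return 1.0 / (1.0 + steepness * excess)

def grazing_rate(g_max, prey_biomass, k_half, cn_ratio):
    """Michaelis-Menten grazing modulated by prey nutritional status."""
    return g_max * prey_biomass / (k_half + prey_biomass) * smp_factor(cn_ratio)

for cn in (6.6, 8.0, 12.0, 20.0):
    print(f"C:N = {cn:4.1f}  ->  grazing = {grazing_rate(1.0, 50.0, 25.0, cn):.3f} d^-1")
```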

Relevance:

100.00%

Publisher:

Abstract:

Semi-enclosed bays in upwelling regions are exposed to forcing related to winds, currents and buoyancy over the shelf. The influence of this external forcing is moderated by factors such as connectivity to the open ocean, shelter by surrounding topography, dimensions of the bay, and freshwater outflows. Such bays, preferred locations for ports, mariculture, marine industry, recreational activities and coastal settlement, present a range of characteristics, an understanding of which is necessary for their rational management. Observations in one such semi-enclosed bay, the Ria de Vigo in Spain, are used to characterize the influence of upwelling and downwelling pulses on its circulation. In this location, near the northern limit of the Iberian upwelling system, upwelling events dominate during a short summer season and downwelling events the rest of the year. The ria response to the external forcing is central to nutrient supply and the resultant plankton productivity that supports its high level of cultured mussel production. Intensive field studies in September 2006 and June 2007 captured a downwelling event and an upwelling event, respectively. Data from eight current profiler moorings and boat-based MiniBat/ADCP surveys provided a quasi-synoptic view of the distribution of water masses and circulation patterns unprecedented for any ria. In the outer ria, circulation was dominated by the introduction of wind-driven alongshore flow from the external continental shelf through the ria entrances and its interaction with the topography. In the middle ria, circulation was primarily related to the upwelling/downwelling cycle, with a cool, salty and dense lower layer penetrating to the inner ria during upwelling over the shelf. A warmer, lower-salinity and less dense surface layer of coastal waters flowed inward during downwelling. Without external forcing, the inner ria responded primarily to tides and buoyancy changes related to land runoff. Under both upwelling and downwelling conditions, the flushing of the ria involved shelf responses to wind pulses. Their persistence for a few days was sufficient to allow waters from the continental shelf to penetrate the innermost ria. Longer-term observations supported by numerical modeling are required to confirm the generality of such flushing events in the ria and to determine their typical frequency, while comparative studies should explore how these scenarios fit into the range of conditions experienced in other semi-enclosed bays.

Relevance:

100.00%

Publisher:

Abstract:

Thermocouples are one of the most popular devices for temperature measurement due to their robustness, ease of manufacture and installation, and low cost. However, when used in certain harsh environments, for example in combustion systems and engine exhausts, large wire diameters are required, and consequently the measurement bandwidth is reduced. This article discusses a software compensation technique, based on measurements from two thermocouples, to address the loss of high-frequency fluctuations. In particular, a difference equation (DE) approach is proposed and compared with existing methods both in simulation and on experimental test rig data with constant flow velocity. It is found that the DE algorithm, combined with the use of generalized total least squares for parameter identification, provides better performance in terms of time constant estimation without any a priori assumption on the time constant ratios of the thermocouples.
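
The underlying idea is that each thermocouple behaves as a first-order lag, τ·dTm/dt + Tm = Tg, so the gas temperature can be reconstructed as Tg ≈ Tm + τ·dTm/dt once the time constant is known; the article identifies the time constants from the two measurements using generalised total least squares. The sketch below demonstrates only the reconstruction step on a synthetic signal with assumed time constants, not the DE/GTLS identification itself.

```python
# Minimal two-thermocouple compensation illustration: simulate two first-order
# sensors with different time constants, then invert the lag model
# Tg ≈ Tm + tau*dTm/dt using finite-difference derivatives.

import numpy as np

dt = 1e-3
t = np.arange(0.0, 2.0, dt)
T_gas = 500.0 + 50.0 * np.sin(2 * np.pi * 5.0 * t)       # synthetic "true" gas temperature

def first_order_lag(signal, tau, dt):
    """Simulate a thermocouple as a discrete first-order lag."""
    out = np.empty_like(signal)
    out[0] = signal[0]
    a = dt / (tau + dt)
    for k in range(1, signal.size):
        out[k] = out[k - 1] + a * (signal[k] - out[k - 1])
    return out

tau1, tau2 = 0.05, 0.15                                   # two different wire diameters
T1 = first_order_lag(T_gas, tau1, dt)
T2 = first_order_lag(T_gas, tau2, dt)

# inverse model: Tg_hat = Tm + tau * dTm/dt (derivative by finite differences)
Tg_hat1 = T1 + tau1 * np.gradient(T1, dt)
Tg_hat2 = T2 + tau2 * np.gradient(T2, dt)

print("RMS error, raw slow thermocouple  :", float(np.sqrt(np.mean((T2 - T_gas) ** 2))))
print("RMS error, compensated (tau known):", float(np.sqrt(np.mean((Tg_hat2 - T_gas) ** 2))))
```

The compensated signal recovers most of the high-frequency content lost by the slow thermocouple, which is the behaviour the DE algorithm seeks without assuming the time constants or their ratio in advance.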