993 results for runoff generation processes
Abstract:
Food restriction has a great impact on skeletal muscle mass by inducing muscle protein breakdown to provide substrates for energy production through gluconeogenesis. Genetic models of hyper-muscularity interfere with the normal balance between protein synthesis and breakdown, which eventually results in extreme muscle growth. Mutations or deletions in the myostatin gene result in extreme muscle mass. Here we evaluated the impact of food restriction for a period of 5 weeks on skeletal muscle size (i.e., fibre cross-sectional area), fibre type composition and contractile properties (i.e., tetanic and specific force) in myostatin null mice. We found that this hyper-muscular model was more susceptible to catabolic processes than wild type mice. The mechanism of skeletal muscle mass loss was examined, and our data show that myostatin null mice placed on a low calorie diet maintained the activity of molecules involved in protein synthesis and did not up-regulate the expression of genes pivotal in ubiquitin-mediated protein degradation. However, we did find an increase in the expression of genes associated with autophagy. Surprisingly, the reduction in muscle size was accompanied by improved tetanic and specific force in the null mice compared with wild type mice. These data provide evidence that food restriction may reverse the hyper-muscular phenotype of the myostatin null mouse, restoring muscle function.
Abstract:
Models of root system growth emerged in the early 1970s, and were based on mathematical representations of root length distribution in soil. The last decade has seen the development of more complex architectural models and the use of computer-intensive approaches to study developmental and environmental processes in greater detail. There is a pressing need for predictive technologies that can integrate root system knowledge, scaling from molecular to ensembles of plants. This paper makes the case for more widespread use of simpler models of root systems based on continuous descriptions of their structure. A new theoretical framework is presented that describes the dynamics of root density distributions as a function of individual root developmental parameters such as rates of lateral root initiation, elongation, mortality, and gravitropism. The simulations resulting from such equations can be performed most efficiently in discretized domains that deform as a result of growth, and that can be used to model the growth of many interacting root systems. The modelling principles described help to bridge the gap between continuum and architectural approaches, and enhance our understanding of the spatial development of root systems. Our simulations suggest that root systems develop in travelling wave patterns of meristems, revealing order in otherwise spatially complex and heterogeneous systems. Such knowledge should assist physiologists and geneticists to appreciate how meristem dynamics contribute to the pattern of growth and functioning of root systems in the field.
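A minimal sketch of the kind of continuous description argued for here, in generic notation (not the authors' own) and assuming a one-dimensional, density-based formulation along depth z: let \rho_a(z,t) be the density of root apices (meristems) and \rho_l(z,t) the root length density. Then

    \frac{\partial \rho_a}{\partial t} + \frac{\partial (e\,\rho_a)}{\partial z} = (b - d)\,\rho_a,
    \qquad
    \frac{\partial \rho_l}{\partial t} = e\,\rho_a,

where e is the apex elongation rate (its direction biased by gravitropism), b the rate of lateral root initiation per apex, and d the apex mortality rate. With constant coefficients, the first equation admits travelling, growing fronts of apex density, consistent with the travelling-wave meristem patterns reported above.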
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to investigate what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
Abstract:
Purpose: Increasing costs of health care, fuelled by demand for high quality, cost-effective healthcare, have driven hospitals to streamline their patient care delivery systems. One such systematic approach is the adoption of Clinical Pathways (CP) as a tool to increase the quality of healthcare delivery. However, most organizations still rely on paper-based pathway guidelines or specifications, which have limitations in process management and, as a result, can influence patient safety outcomes. In this paper, we present a method for generating clinical pathways based on organizational semiotics, capturing knowledge from the syntactic, semantic and pragmatic to the social level. Design/methodology/approach: The proposed modeling approach to the generation of CPs adopts organizational semiotics and enables the generation of a semantically rich representation of CP knowledge. The Semantic Analysis Method (SAM) is applied to explicitly represent the semantics of the concepts, their relationships and patterns of behavior in terms of an ontology chart. The Norm Analysis Method (NAM) is adopted to identify and formally specify patterns of behavior and rules that govern the actions identified on the ontology chart. Information collected during semantic and norm analysis is integrated to guide the generation of CPs using best practice represented in BPMN, thus enabling the automation of CPs. Findings: This research confirms the necessity of taking social aspects into consideration when designing information systems and automating CPs. The complexity of healthcare processes can best be tackled by analyzing stakeholders, whom we treat as social agents, their goals and patterns of action within the agent network. Originality/value: Current modeling methods describe CPs from a structural aspect comprising activities, properties and interrelationships. However, these methods lack a mechanism to describe possible patterns of human behavior and the conditions under which the behavior will occur. To overcome this weakness, a semiotic approach to the generation of clinical pathways is introduced. The CPs generated from SAM together with norms will enrich the knowledge representation of the domain through ontology modeling, which allows the recognition of human responsibilities and obligations and, more importantly, the ultimate power of decision making in exceptional circumstances.
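As a purely illustrative sketch of how a NAM-style norm ("whenever <context>, if <condition>, then <agent> is <deontic operator> to <action>") might be captured in machine-readable form and mapped towards BPMN, the Python fragment below uses hypothetical field names and a hypothetical to_bpmn_user_task helper; it is not the method described in the paper.

    from dataclasses import dataclass

    # Hypothetical encoding of a behavioural norm in the NAM style:
    # "whenever <context> if <condition> then <agent> is <deontic> to <action>".
    @dataclass
    class Norm:
        context: str
        condition: str
        agent: str
        deontic: str   # "obliged", "permitted" or "prohibited"
        action: str

    def to_bpmn_user_task(norm: Norm, task_id: str) -> str:
        """Render an obligation as a minimal BPMN userTask element (illustrative only)."""
        if norm.deontic != "obliged":
            raise ValueError("only obligations are mapped to tasks in this sketch")
        return (f'<userTask id="{task_id}" name="{norm.action}">'
                f'<documentation>{norm.context}; if {norm.condition}; '
                f'performer: {norm.agent}</documentation></userTask>')

    norm = Norm(
        context="whenever a patient is admitted to the stroke pathway",
        condition="the swallow screen has not been completed",
        agent="ward nurse",
        deontic="obliged",
        action="perform swallow screening within 4 hours",
    )
    print(to_bpmn_user_task(norm, "Task_SwallowScreen"))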
Abstract:
The goal of the Chemistry-Climate Model Validation (CCMVal) activity is to improve understanding of chemistry-climate models (CCMs) through process-oriented evaluation and to provide reliable projections of stratospheric ozone and its impact on climate. An appreciation of the details of model formulations is essential for understanding how models respond to the changing external forcings of greenhouse gases and ozone-depleting substances, and hence for understanding the ozone and climate forecasts produced by the models participating in this activity. Here we introduce and review the models used for the second round (CCMVal-2) of this intercomparison, regarding the implementation of chemical, transport, radiative, and dynamical processes in these models. In particular, we review the advantages and problems associated with approaches used to model processes of relevance to stratospheric dynamics and chemistry. Furthermore, we state the definitions of the reference simulations performed, and describe the forcing data used in these simulations. We identify some developments in chemistry-climate modeling that make models more physically based or more comprehensive, including the introduction of an interactive ocean, online photolysis, troposphere-stratosphere chemistry, and non-orographic gravity-wave deposition linked to tropospheric convection. These relatively new developments indicate that stratospheric CCM modeling is becoming more consistent with our physically based understanding of the atmosphere.
Abstract:
This chapter presents techniques used for the generation of 3D digital elevation models (DEMs) from remotely sensed data. Three methods are explored and discussed: optical stereoscopic imagery, Interferometric Synthetic Aperture Radar (InSAR), and Light Detection and Ranging (LIDAR). For each approach, the state of the art presented in the literature is reviewed. Techniques involved in DEM generation are presented together with accuracy evaluation, and results of DEMs reconstructed from remotely sensed data are illustrated. While the process of DEM generation from satellite stereoscopic imagery represents a good example of the passive, multi-view imaging technology discussed in Chap. 2 of this book, InSAR and LIDAR use different principles to acquire 3D information. With regard to InSAR and LIDAR, detailed discussions are conducted in order to convey the fundamentals of both technologies.
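As a hedged illustration of the simplest step in LIDAR-based DEM generation, gridding a point cloud into a raster of mean elevations, the sketch below assumes plain NumPy arrays of coordinates; real pipelines would first classify ground returns and then interpolate empty cells.

    import numpy as np

    def points_to_dem(x, y, z, cell=1.0):
        """Grid a LIDAR point cloud (x, y, z arrays) into a mean-elevation DEM.

        Empty cells are returned as NaN; a real workflow would first separate
        ground from non-ground returns and fill gaps by interpolation.
        """
        col = ((x - x.min()) / cell).astype(int)
        row = ((y - y.min()) / cell).astype(int)
        nrow, ncol = row.max() + 1, col.max() + 1
        sums = np.zeros((nrow, ncol))
        counts = np.zeros((nrow, ncol))
        np.add.at(sums, (row, col), z)
        np.add.at(counts, (row, col), 1)
        with np.errstate(invalid="ignore"):
            dem = sums / counts          # NaN where no points fell in a cell
        return dem

    # Example with synthetic points on a gently sloping, noisy surface
    rng = np.random.default_rng(0)
    x, y = rng.uniform(0, 100, 10_000), rng.uniform(0, 100, 10_000)
    z = 0.05 * x + rng.normal(0, 0.2, x.size)
    dem = points_to_dem(x, y, z, cell=2.0)
    print(dem.shape)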
Abstract:
The Canadian Middle Atmosphere Modelling (MAM) project is a collaboration between the Atmospheric Environment Service (AES) of Environment Canada and several Canadian universities. Its goal is the development of a comprehensive General Circulation Model of the troposphere-stratosphere-mesosphere system, starting from the AES/CCCma third-generation atmospheric General Circulation Model. This paper describes the basic features of the first-generation Canadian MAM and some aspects of its radiative-dynamical climatology. Standard first-order mean diagnostics are presented for monthly means and for the annual cycle of zonal-mean winds and temperatures. The mean meridional circulation is examined, and a comparison is made between the steady diabatic, downward-controlled, and residual stream functions. It is found that downward control holds quite well in the monthly mean through most of the middle atmosphere, even during equinoctial periods. The relative roles of different drag processes in determining the mean downwelling over the wintertime polar middle stratosphere are examined, and the vertical structure of the drag is quantified.
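For readers unfamiliar with the downward-control comparison mentioned above, a simplified steady-state, quasi-geostrophic sketch of the principle (generic notation, not taken from the paper) is

    f\,\bar{v}^{*} \approx -\bar{F},
    \qquad
    \frac{\partial(\rho_0\,\bar{v}^{*})}{\partial y} + \frac{\partial(\rho_0\,\bar{w}^{*})}{\partial z} = 0,

where \bar{F} is the zonal force per unit mass exerted by wave drag. Requiring \rho_0\,\bar{w}^{*} \to 0 aloft and integrating the continuity equation from level z upward gives

    \bar{w}^{*}(y,z) = -\frac{1}{\rho_0(z)}\,\frac{\partial}{\partial y}\int_z^{\infty}\frac{\rho_0(z')\,\bar{F}(y,z')}{f}\,dz',

so the mean downwelling at a given level is determined by the drag exerted above it, which is the sense in which the drag processes discussed above "control" the polar downwelling.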
Abstract:
Advances in the science and observation of climate change are providing a clearer understanding of the inherent variability of Earth’s climate system and its likely response to human and natural influences. The implications of climate change for the environment and society will depend not only on the response of the Earth system to changes in radiative forcings, but also on how humankind responds through changes in technology, economies, lifestyle and policy. Extensive uncertainties exist in future forcings of and responses to climate change, necessitating the use of scenarios of the future to explore the potential consequences of different response options. To date, such scenarios have not adequately examined crucial possibilities, such as climate change mitigation and adaptation, and have relied on research processes that slowed the exchange of information among physical, biological and social scientists. Here we describe a new process for creating plausible scenarios to investigate some of the most challenging and important questions about climate change confronting the global community.
Abstract:
Aimed at reducing deficiencies in representing the Madden-Julian oscillation (MJO) in general circulation models (GCMs), a global model evaluation project on vertical structure and physical processes of the MJO was coordinated. In this paper, results from the climate simulation component of this project are reported. It is shown that the MJO remains a great challenge in these latest generation GCMs. The systematic eastward propagation of the MJO is only well simulated in about one-fourth of the total participating models. The observed vertical westward tilt with altitude of the MJO is well simulated in good MJO models, but not in the poor ones. Damped Kelvin wave responses to the east of convection in the lower troposphere could be responsible for the missing MJO preconditioning process in these poor MJO models. Several process-oriented diagnostics were conducted to discriminate key processes for realistic MJO simulations. While large-scale rainfall partition and low-level mean zonal winds over the Indo-Pacific in a model are not found to be closely associated with its MJO skill, two metrics, including the low-level relative humidity difference between high and low rain events and seasonal mean gross moist stability, exhibit statistically significant correlations with the MJO performance. It is further indicated that increased cloud-radiative feedback tends to be associated with reduced amplitude of intraseasonal variability, which is incompatible with the radiative instability theory previously proposed for the MJO. Results in this study confirm that inclusion of air-sea interaction can lead to significant improvement in simulating the MJO.
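A rough sketch of how the first of those two metrics, the low-level relative humidity difference between high- and low-rain events, could be computed from model output; the percentile thresholds and synthetic data are illustrative assumptions, not the published definition.

    import numpy as np

    def rh_difference_metric(rain, rh_low_level, high_pct=90, low_pct=20):
        """Difference in lower-tropospheric relative humidity between heavy- and
        light-rain events, a rough proxy for the moisture-convection coupling
        diagnostic discussed above (percentile thresholds are illustrative)."""
        high = rain >= np.percentile(rain, high_pct)
        low = rain <= np.percentile(rain, low_pct)
        return rh_low_level[high].mean() - rh_low_level[low].mean()

    # Synthetic daily series: wetter days tend to be moister at low levels
    rng = np.random.default_rng(1)
    rain = rng.gamma(shape=0.8, scale=5.0, size=3650)
    rh = 60 + 3.0 * np.log1p(rain) + rng.normal(0, 5, rain.size)
    print(f"RH(high rain) - RH(low rain) = {rh_difference_metric(rain, rh):.1f} %")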
Abstract:
By-product streams from a sunflower-based biodiesel plant were utilised for the production of fermentation media that can be used for the production of polyhydroxyalkanoates (PHA). Sunflower meal was utilised as substrate for the production of crude enzyme consortia through solid state fermentation (SSF) with the fungal strain Aspergillus oryzae. Fermented solids were subsequently mixed with unprocessed sunflower meal aiming at the production of a nutrient-rich fermentation feedstock. The highest free amino nitrogen (FAN) and inorganic phosphorus concentrations achieved were 1.5 g L-1 and 246 mg L-1, respectively, when an initial proteolytic activity of 6.4 U mL-1 was used. The FAN concentration was increased to 2.3 g L-1 when the initial proteolytic activity was increased to 16 U mL-1. Sunflower meal hydrolysates were mixed with crude glycerol to provide fermentation media that were evaluated for the production of poly(3-hydroxybutyrate-co-3-hydroxyvalerate) (P(3HB-co-3HV)) using Cupriavidus necator DSM545. The P(3HB-co-3HV) produced (9.9 g L-1) contained 3HB and 3HV units at 97 and 3 mol%, respectively. Integrating PHA production into existing first-generation biodiesel production plants through valorisation of by-product streams could improve their sustainability.
Abstract:
The rapid development of biodiesel production technology has led to the generation of tremendous quantities of glycerol wastes, the main by-product of the process. Stoichiometrically, it has been calculated that for every 100 kg of biodiesel, 10 kg of glycerol are produced. Depending on the technology employed by various biodiesel plants, glycerol wastes may contain numerous kinds of impurities such as methanol, salts, soaps, heavy metals and residual fatty acids. This often renders biodiesel-derived glycerol unprofitable for further purification. Therefore, the utilization of crude glycerol through biotechnological means represents a promising alternative for the effective management of this industrial waste. This review summarizes the effect of the various impurities and contaminants found in biodiesel-derived crude glycerol upon its conversion by microbial strains in biotechnological processes. Insights are given into the technologies currently applied in biodiesel production, with emphasis on the impurities that enter the composition of crude glycerol at each step of the production process. Moreover, the impact of the nature of these impurities upon the performance of prokaryotic and eukaryotic microorganisms during crude glycerol bioconversions into a variety of high added-value metabolic products is discussed extensively. Finally, approaches reported in the literature for treating crude glycerol to remove inhibitory contaminants are presented and comprehensively discussed.
Abstract:
A few years ago, it was reported that ozone is produced in human atherosclerotic arteries, on the basis of the identification of 3β-hydroxy-5-oxo-5,6-secocholestan-6-al and 3β-hydroxy-5β-hydroxy-B-norcholestane-6β-carboxaldehyde (ChAld) as their 2,4-dinitrophenylhydrazones. The formation of endogenous ozone was attributed to water oxidation catalyzed by antibodies, with the formation of dihydrogen trioxide as a key intermediate. We now report that ChAld is also generated by the reaction of cholesterol with singlet molecular oxygen [O₂(¹Δg)] produced by photodynamic action or by the thermodecomposition of 1,4-dimethylnaphthalene endoperoxide, a defined pure chemical source of O₂(¹Δg). On the basis of ¹⁸O-labeled ChAld mass spectrometry, NMR, light emission measurements, and derivatization studies, we propose that the mechanism of ChAld generation involves the formation of the well-known cholesterol 5α-hydroperoxide (5α-OOH), the major product of O₂(¹Δg) oxidation of cholesterol, and/or a 1,2-dioxetane intermediate formed by O₂(¹Δg) attack at the Δ5 position. Hock cleavage of 5α-OOH (the major pathway) or decomposition of the unstable cholesterol dioxetane (a minor pathway, traces) gives a 5,6-secosterol intermediate, which undergoes intramolecular aldolization to yield ChAld. These results show clearly and unequivocally that ChAld is generated upon the reaction of cholesterol with O₂(¹Δg) and raise questions about the role of ozone in biological processes.
Abstract:
Distributed energy and water balance models require time series of surfaces of the meteorological variables involved in hydrological processes. Most hydrological GIS-based models apply simple interpolation techniques to extend the point-scale values registered at weather stations to the watershed scale. In mountainous areas, where the monitoring network does not effectively cover the complex terrain heterogeneity, simple geostatistical methods for spatial interpolation are not always representative enough, and algorithms that explicitly or implicitly account for the features creating strong local gradients in the meteorological variables must be applied. Originally developed as a meteorological pre-processing tool for a complete hydrological model (WiMMed), MeteoMap has become an independent software tool. The individual interpolation algorithms used to approximate the spatial distribution of each meteorological variable were carefully selected taking into account both the specific variable being mapped and the common lack of input data in Mediterranean mountainous areas. They include corrections with height for both rainfall and temperature (Herrero et al., 2007), and topographic corrections for solar radiation (Aguilar et al., 2010). MeteoMap is GIS-based freeware, available upon registration. Input data include weather station records and topographic data, and the output consists of tables and maps of the meteorological variables at hourly, daily, predefined rainfall-event-duration or annual scales. It offers its own pre- and post-processing tools, including video display, map printing and the possibility of exporting the maps to images or ASCII ArcGIS formats. This study presents the user-friendly interface of the software and shows some case studies with applications to hydrological modeling.
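A minimal sketch of an elevation-corrected interpolation of station temperatures, in the spirit of (but not identical to) the height corrections cited above; the constant lapse rate, inverse-distance weighting and all names are illustrative assumptions, not the WiMMed/MeteoMap algorithms.

    import numpy as np

    def idw_temperature(xy_stations, z_stations, t_stations, xy_grid, z_grid,
                        lapse=-0.0065, power=2.0):
        """Interpolate station temperatures to grid cells with a simple
        elevation correction: reduce to sea level with a constant lapse rate
        (deg C per m), inverse-distance-weight, then restore the grid-cell
        elevation. Illustrative only."""
        t_sea = t_stations - lapse * z_stations          # remove elevation signal
        d = np.linalg.norm(xy_grid[:, None, :] - xy_stations[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-6) ** power
        t_grid_sea = (w * t_sea).sum(axis=1) / w.sum(axis=1)
        return t_grid_sea + lapse * z_grid               # add elevation signal back

    stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # km
    z_sta = np.array([500.0, 1500.0, 2500.0])                      # m a.s.l.
    t_sta = np.array([15.0, 9.0, 2.0])                             # deg C
    grid_xy = np.array([[5.0, 5.0], [2.0, 8.0]])
    grid_z = np.array([1200.0, 2000.0])
    print(idw_temperature(stations, z_sta, t_sta, grid_xy, grid_z))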
Abstract:
Nowadays, more than half of software development projects fail to meet the final users' expectations. One of the main causes is insufficient knowledge about the organization of the enterprise to be supported by the respective information system. The DEMO methodology (Design and Engineering Methodology for Organizations) has proven to be a well-defined method to specify, through models and diagrams, the essence of any organization at a high level of abstraction. However, this methodology is independent of the implementation platform and lacks the possibility of saving and propagating changes from the organization models to the implemented software in a runtime environment. The Universal Enterprise Adaptive Object Model (UEAOM) is a conceptual schema used as the basis for a wiki system that allows the modeling of any organization, independently of its implementation, as well as the aforementioned change propagation in a runtime environment. Based on DEMO and UEAOM, this project aims to develop efficient and standardized methods for the automatic conversion of DEMO Ontological Models, based on the UEAOM specification, into BPMN (Business Process Model and Notation) process models with clear, unambiguous semantics, in order to facilitate the creation of processes that are almost ready to be executed on workflow systems supporting BPMN.
Abstract:
We investigate the potential of TESLA and JLC/NLC electron-positron linear collider designs to observe diquarks produced resonantly in processes involving hard photons.