914 results for Application of Data-driven Modelling in Water Sciences
Abstract:
The dinuclear complex [{Ru(CN)4}2(μ-bppz)]4− shows a strongly solvent-dependent metal–metal electronic interaction, which allows the mixed-valence state to be switched from class II to class III by changing the solvent from water to CH2Cl2. In CH2Cl2 the separation between the successive Ru(II)/Ru(III) redox couples is 350 mV and the IVCT band (from UV/Vis/NIR spectroelectrochemistry) is characteristic of a borderline class II/III or class III mixed-valence state. In water, the redox separation is only 110 mV and the much broader IVCT transition is characteristic of a class II mixed-valence state. This is consistent with the observation that raising or lowering the energy of the d(π) orbitals, in CH2Cl2 or water respectively, will decrease or increase the energy gap to the LUMO of the bppz bridging ligand, which provides the delocalisation pathway via electron transfer. IR spectroelectrochemistry could only be carried out successfully in CH2Cl2 and revealed class III mixed-valence behaviour on the fast IR timescale. In contrast, time-resolved IR spectroscopy showed that the MLCT excited state, which is formulated as RuIII(bppz˙−)RuII and can therefore be considered a mixed-valence Ru(II)/Ru(III) complex with an intermediate bridging radical-anion ligand, is localised on the IR timescale, with spectroscopically distinct Ru(II) and Ru(III) termini. This is because the necessary electron transfer via the bppz ligand is made more difficult by the additional electron on bppz˙−, which raises the energy of the orbital through which electron exchange occurs. DFT calculations reproduce the electronic spectra of the complex well in all three oxidation states (Ru(II)/Ru(II), Ru(II)/Ru(III) and Ru(III)/Ru(III)) in both water and CH2Cl2, as long as explicit allowance is made for the water molecules hydrogen-bonded to the cyanides in the model used.
They also reproduce the excited-state IR spectra of both [Ru(CN)4(μ-bppz)]2– and [{Ru(CN)4}2(μ-bppz)]4− very well in both solvents. The reorganisation of the water solvent shell suggests a possible dynamical reason for the longer lifetime of the triplet state in water compared to CH2Cl2.
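For orientation on the class II assignment from the IVCT bandwidth, the standard Hush relation of mixed-valence chemistry (a textbook result added here for context, not part of the abstract's own analysis) gives the minimum bandwidth expected for a class II band at room temperature:

```latex
% Hush relation for the IVCT band of a class II mixed-valence complex
% (all wavenumbers in cm^{-1}, high-temperature limit near 298 K):
\Delta\bar{\nu}_{1/2} \;\geq\; \left( 2310\,\bar{\nu}_{\max} \right)^{1/2}
```

Bands markedly narrower than this bound signal borderline class II/III or class III behaviour, which is the criterion implicit in the CH2Cl2 versus water comparison above.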
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
A new approach is proposed in this work for the treatment of boundary value problems with Adomian's decomposition method. Although frequently claimed to be accurate and fast-converging, the original formulation of Adomian's method does not allow the treatment of homogeneous boundary conditions along closed boundaries. The technique presented here overcomes this difficulty and is applied to the analysis of magnetohydrodynamic duct flows. Results are in good agreement with finite element method calculations and with analytical solutions for square ducts. New possibilities therefore appear for the application of Adomian's method in electromagnetics.
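For readers unfamiliar with the method, the following minimal sketch (a toy linear problem, not the MHD duct formulation of the paper) shows the iterative structure of Adomian decomposition on y' = y, y(0) = 1, whose series components turn out to be the Taylor terms of e^x:

```python
import sympy as sp

x = sp.symbols('x')

# Adomian decomposition for the toy problem y' = y, y(0) = 1.
# The solution is written as a series y = sum_n y_n, with
#   y_0 = y(0)  and  y_{n+1}(x) = integral_0^x y_n(t) dt.
# For this linear problem the components are the Taylor terms of exp(x).
components = [sp.Integer(1)]          # y_0 = 1
for _ in range(8):
    components.append(sp.integrate(components[-1], (x, 0, x)))

partial_sum = sum(components)          # 9-term approximation to exp(x)
```

For nonlinear problems the integrand of each step is replaced by an Adomian polynomial of the previous components; the boundary-condition difficulty the paper addresses arises when conditions are imposed on a closed boundary rather than at a single point as here.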
Abstract:
Includes bibliography
Abstract:
Xylanase biosynthesis is induced by its substrate, xylan. The high xylan content of some wastes, such as wheat residues (wheat bran and wheat straw), makes them accessible and cheap sources of inducers, mainly suited to large fermentation volumes such as those of industrial bioreactors. Thus, the main proposal of this work was to incorporate into the nutrient medium wheat straw particles decomposed into soluble compounds (liquor) by autohydrolysis treatment of the lignocellulosic material, as a strategy to increase xylanase production by Aspergillus ochraceus while lowering its cost. Wheat straw autohydrolysis liquor produced under several conditions was used as the sole carbon source or together with wheat bran. The best conditions for xylanase and beta-xylosidase production were observed when A. ochraceus was cultivated with 1% wheat bran supplemented with 10% wheat straw liquor (produced after 15 min of hydrothermal treatment) as carbon source. This substrate was more favourable than xylan, wheat bran, or wheat straw autohydrolysis liquor used separately. Application of this substrate mixture in a stirred-tank bioreactor indicated the possibility of scaling up the process to commercial production.
Abstract:
OBJECTIVE: To evaluate the ease of application of two-piece, graduated compression systems for the treatment of venous ulcers. METHODS: Four kits used to provide limb compression in the management of venous ulcers were evaluated; these have been proven non-inferior to various types of bandages in clinical trials. The interface pressure exerted above the ankle by the under-stocking and by the complete compression system, and the force required to pull the over-stocking off, were assessed in vitro. Ease of application of the four kits was evaluated in four sessions by five nurses who put the stockings on their own legs in a blinded manner. They expressed their assessment of the stockings using a series of visual analogue scales (VASs). RESULTS: The Sigvaris Ulcer X® kit provided a mean interface pressure of 46 mmHg and required a force in the range of 60-90 N to remove. The Mediven® ulcer kit exerted the same pressure but required a force in the range of 150-190 N to remove. Two kits (SurePress® Comfort and VenoTrain® Ulcertec) exerted a mean pressure of only 25 mmHg and needed a force in the range of 100-160 N to remove. Nurses judged the Ulcer X and SurePress kits easiest to apply; application of the VenoTrain kit was found slightly more difficult, and the Mediven kit was judged difficult to use. CONCLUSIONS: Comparison of the ease of application of compression-stocking kits on normal legs revealed marked differences between them. Only one system exerted a high pressure and was easy to apply. Direct comparison of these compression kits in leg-ulcer patients is required to assess whether our laboratory findings correlate with patient compliance and ulcer healing.
Abstract:
Leptospiral pulmonary haemorrhage syndrome (LPHS) is a particularly severe form of leptospirosis. LPHS is increasingly recognized in both humans and animals and is characterized by rapidly progressive intra-alveolar haemorrhage leading to high mortality. The pathogenic mechanisms of LPHS are poorly understood, which hampers the application of effective treatment regimes. In this study a 2-D guinea pig lung proteome map was created and used to investigate the pathogenic mechanisms of LPHS. Comparison of lung proteomes from infected and non-infected guinea pigs via differential in-gel electrophoresis revealed highly significant differences in the abundance of proteins contained in 130 spots. Acute phase proteins were the largest functional group amongst proteins with increased abundance in LPHS lung tissue, likely reflecting a local and/or systemic host response to infection. The observed decrease in abundance of proteins involved in cytoskeletal and cellular organization in LPHS lung tissue further suggests that infection with pathogenic Leptospira induces changes in the abundance of host proteins involved in cellular architecture and adhesion, contributing to the dramatically increased alveolar septal wall permeability seen in LPHS. BIOLOGICAL SIGNIFICANCE: The recent completion of the genome sequence of the guinea pig (Cavia porcellus) provides innovative opportunities to apply proteomic technologies to an important animal model of disease. In this study, the comparative proteomic analysis of lung tissue from guinea pigs experimentally infected with leptospiral pulmonary haemorrhage syndrome (LPHS) revealed a decrease in the abundance of proteins involved in cellular architecture and adhesion, suggesting that loss or down-regulation of cytoskeletal and adhesion molecules plays an important role in the pathogenesis of LPHS. A publicly available guinea pig lung proteome map was constructed to facilitate future pulmonary proteomics in this species.
Abstract:
The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire regarding farm structure, farm management, mortality and antimicrobial use was sent to all farmers registered in a Swiss label program setting requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a usable questionnaire (response rate=28.5%), of which 40.9% only fattened their own calves (group O), 56.9% fattened their own calves and additional purchased calves (group O&P), and 2.3% only purchased calves for fattening (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, higher incidence of BRD, and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% upon arrival and later): in 3.2% of the herds of group O, 17.9% of those in group O&P, and 92.9% of those in group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR=8.9) and herd size (OR=1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR=2.9) and risk of respiratory disease (OR=1.2 per 10% higher risk), and negatively with the use of individual antimicrobial treatment (OR=0.3). In many countries, purchase and a large herd size are inherently connected to veal production.
The Swiss situation, with large commercial herds but also smaller herds with little or no purchase of calves, made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system where small farms raise the calves from their own herds has substantial potential to improve animal health and reduce antimicrobial drug use.
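As an aside on the reported odds ratios (e.g. OR=8.9 for purchase), the sketch below shows how a crude odds ratio and its 95% confidence interval are computed from a 2x2 exposure-outcome table. The counts are synthetic, not the study's data, and the study itself used multivariable logistic regression rather than this unadjusted estimate:

```python
import math

# Synthetic 2x2 table (NOT the study's data):
#                      metaphylaxis   no metaphylaxis
# purchased calves          40              180
# no purchased calves        5              200
a, b = 40, 180   # exposed:   outcome yes / no
c, d = 5, 200    # unexposed: outcome yes / no

odds_ratio = (a * d) / (b * c)                 # crude odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf SE of ln(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

A multivariable model adjusts such estimates for the other covariates (herd size, BRD incidence, etc.), so the published ORs are not recoverable from any single table like this.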
Abstract:
The flux of foreign investment into the water industry has led to the internationalisation of contracts and of the methods for settling possible disputes. When disputes over the performance of a water concession give rise to investor-state arbitrations, public authorities are put in a challenging position. The state needs to combine two different roles: its role in the provision of services of public interest, and the fulfilment of its international legal obligations arising from international investment agreements. The complexity of this relationship is evident in a variety of procedural and substantive issues that have been surfacing in arbitration proceedings conducted before the International Centre for Settlement of Investment Disputes. The purpose of this dissertation is to discuss the impact of investment arbitration on the protection of public interests associated with water services. In deciding these cases, arbitrators are contributing significantly to shaping the contours and substance of an emerging international economic water services regime. Through the looking glass of arbitration awards, one can see the substantial consequences that the international investment regime has been producing in water markets and how significantly it has been affecting the public interests associated with water services. Due consideration of the public interests in water concession disputes requires concerted action in two different domains: changing the investment arbitration mechanism, by promoting the transparency of proceedings and the participation of non-parties; and changing the regulatory framework that underpins investments in water services. Combined, these improvements are likely to infuse public interests into water concession arbitrations.
Abstract:
This thesis studies how commercial practice is developing with artificial intelligence (AI) technologies and discusses some normative concepts in EU consumer law. The author analyses the phenomenon of 'algorithmic business', which denotes the increasing use of data-driven AI in marketing organisations for the optimisation of a range of consumer-related tasks. The phenomenon is orienting business-consumer relations towards some general trends that influence the power and behaviour of consumers. These developments are not taking place in a legal vacuum, but against the background of a normative system aimed at maintaining fairness and balance in market transactions. The author assesses current developments in commercial practices in the context of EU consumer law, which is specifically aimed at regulating commercial practices. The analysis is critical by design and, without neglecting concrete practices, tries to look at the big picture. The thesis consists of nine chapters divided into three thematic parts. The first part discusses the deployment of AI in marketing organisations: a brief history, the technical foundations, and the modes of integration in business organisations. In the second part, a selected number of socio-technical developments in commercial practice are analysed: the monitoring and analysis of consumers' behaviour based on data; the personalisation of commercial offers and customer experience; the use of information on consumers' psychology and emotions; and the mediation of commercial relations through marketing conversational applications. The third part assesses these developments in the context of EU consumer law and of the broader policy debate concerning consumer protection in the algorithmic society.
In particular, two normative concepts underlying the EU fairness standard are analysed: manipulation, as a substantive regulatory standard that limits commercial behaviour in order to protect consumers' informed and free choices; and vulnerability, as a concept of social policy that describes people who are more exposed to marketing practices.
Abstract:
This study aims to analyze concepts and practices developed by nurses in occupational health in primary care, and is justified by the need to expand knowledge of this thematic area. This is an analytical qualitative study carried out in primary care units of the health districts of the city of Natal-RN, in one health unit per neighborhood. Data collection took place from August to October 2014, through semi-structured interviews, in the following order: selection of respondents and scheduling of interviews; interviews and application of the data collection instrument to trace the socio-demographic profile of the target population; transcription of interviews; and categorization of the information and analysis in light of hermeneutic-dialectics. The concept of occupational health reported by the subjects investigated, although simplified with respect to the specificities of workers, revealed a wide dimension, approaching workers in their physical, mental and social context, and suggesting a good grasp of the expanded concept of health. Furthermore, it was possible to confirm the incipient performance of primary care nurses in occupational health, which was described as deficient. In general, some specific occupational health actions carried out in health facilities were cited. Other activities proved to be routine, carried out by a minority of professionals aware of the importance and need to reach these users in order to engage them in the routine of the health unit. Most professionals reported not having studied occupational health during their undergraduate nursing education, highlighting a gap in both the theoretical and practical aspects of the area.
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
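To give a flavour of the data-driven benchmarking idea (the numbers below are synthetic, not HECToR measurements), one simple form is to fit a parametric cost model to measured runtimes and use it to predict the time for an unmeasured problem size:

```python
import numpy as np

# Synthetic runtimes for a stencil-like kernel on an N x N grid, assumed
# to follow t(N) = a + b*N^2 (illustrative model, not the paper's data).
N = np.array([64, 128, 256, 512, 1024], dtype=float)
t = np.array([0.011, 0.042, 0.165, 0.656, 2.622])   # seconds (synthetic)

# Least-squares fit of t against the basis [1, N^2].
A = np.column_stack([np.ones_like(N), N**2])
(a, b), *_ = np.linalg.lstsq(A, t, rcond=None)

predicted_2048 = a + b * 2048**2   # extrapolate to an unmeasured size
```

The source-code-based technique the paper starts from derives a and b from instruction counts and hardware characteristics instead; the data-driven approach infers them from benchmark runs, which is what rescued prediction on the Interlagos processor.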
Abstract:
Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species.
This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, to which density dependence, environmental stochasticity and culling were added. This model was implemented in a management-support software package named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics are driven not only by demographic factors, but also by dispersal and the colonisation of new areas. Habitat suitability and obstacle modelling therefore had to be addressed. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and results validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared with a commonly used habitat suitability assessment method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information were finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density.
The latter varies according to local reproduction/survival and dispersal dynamics, modified by density dependence and stochasticity. A software package named HexaSpace was developed, which performs two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); and (2) running simulations. It allows studying the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to proceed from raw data to a complex, realistic model, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
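To illustrate the kind of model underlying SIM-Ibex, the sketch below projects an age-structured population with a Leslie matrix and a simple Beverton-Holt-style density-dependent damping. The matrix entries, carrying capacity and initial abundances are illustrative values, not the parameters estimated for the ibex, and sex structure, stochasticity and culling are omitted:

```python
import numpy as np

# Illustrative 3-age-class Leslie matrix (NOT the SIM-Ibex parameters):
# row 0 holds per-class fecundities; sub-diagonals hold survival rates.
L = np.array([[0.0, 0.5, 1.0],
              [0.9, 0.0, 0.0],
              [0.0, 0.85, 0.0]])
K = 500.0                          # carrying capacity (illustrative)
n = np.array([50.0, 20.0, 10.0])   # initial abundances per age class

for _ in range(200):
    # Beverton-Holt-style damping: growth slows as total density nears K.
    n = (L @ n) / (1.0 + n.sum() / K)

total = n.sum()   # settles near K*(lambda - 1), lambda = dominant eigenvalue
```

Adding environmental stochasticity amounts to perturbing the matrix entries each step, and culling to subtracting a harvest vector, which is what the full model layers on top of this projection.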
Abstract:
As a result of urbanization, stormwater runoff flow rates and volumes are significantly increased, due to increasing impervious land cover and the decreased availability of depression storage. Storage tanks are the basic devices for efficiently controlling the flow rate in drainage systems during wet weather. The concept of vacuum-driven detention tanks presented in this paper increases the storage capacity by using the space above the free-surface water elevation at the inlet channel. Partial vacuum storage makes it possible to gain cost savings by reducing both the horizontal area of the detention tank and the necessary foundation depth. A simulation model of a vacuum-driven storage tank has been developed to estimate the potential benefits of its application in an urban drainage system. Although SWMM5 has no direct options for vacuum tanks, existing functions (i.e. control rules) have been used to reflect its operation phases. The rainfall data used in the simulations were recorded at a rain gauge in Czestochowa during the years 2010-2012, with a time interval of 10 minutes. The simulation results give an overview of the practical operation and maintenance cost (energy demand) of vacuum-driven storage tanks, depending on the ratio of vacuum-driven volume to total storage capacity. The following conclusions can be drawn from this investigation: vacuum-driven storage tanks are characterized by uncomplicated construction and control systems, and thus can be applied in newly developed as well as existing urban drainage systems; the application of vacuum in underground detention facilities makes it possible to increase the storage capacity of existing reservoirs by using the space above the maximum depth, and the possible increase in storage capacity can reach even a few dozen percent at relatively low investment cost; and vacuum-driven storage tanks can be modelled in existing simulation software (e.g. SWMM) using options intended for pumping stations (including control and action rules).
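A minimal sketch of the kind of mass balance such a simulation performs, with made-up inflow and geometry rather than the paper's SWMM5 model: the tank fills conventionally up to the free-surface depth limit, and the vacuum-held volume above that level counts as extra capacity before any spill occurs.

```python
# Simple detention-tank routing sketch (illustrative numbers, not the
# paper's SWMM5 model). Euler mass balance: dV/dt = Q_in - Q_out.
DT = 60.0                    # time step [s]
AREA = 100.0                 # tank plan area [m^2]
H_MAX = 3.0                  # free-surface depth limit [m]
VACUUM_FRACTION = 0.3        # extra volume held above H_MAX by vacuum
CAP_FREE = AREA * H_MAX                        # conventional capacity [m^3]
CAP_TOTAL = CAP_FREE * (1 + VACUUM_FRACTION)   # with vacuum storage

q_out = 0.05                 # throttled outflow [m^3/s]
inflow = [0.14] * 60 + [0.02] * 60   # synthetic storm hydrograph [m^3/s]

volume = overflow = peak = 0.0
for q_in in inflow:
    volume = max(volume + (q_in - q_out) * DT, 0.0)
    if volume > CAP_TOTAL:           # spill only above the vacuum volume
        overflow += volume - CAP_TOTAL
        volume = CAP_TOTAL
    peak = max(peak, volume)
```

With these numbers the peak stored volume exceeds CAP_FREE but stays within CAP_TOTAL, so the vacuum fraction is exactly what prevents a spill; in SWMM the same behaviour is emulated with pump objects and control rules, as the paper describes.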
Abstract:
In this study, an effective microbial consortium for the biodegradation of phenol was grown under different operational conditions, and the effects of phosphate concentration (1.4 g L-1, 2.8 g L-1, 4.2 g L-1), temperature (25 degrees C, 30 degrees C, 35 degrees C), agitation (150 rpm, 200 rpm, 250 rpm) and pH (6, 7, 8) on phenol degradation were investigated, whereupon an artificial neural network (ANN) model was developed in order to predict degradation. The learning, recall and generalization characteristics of the neural network were studied using data from the phenol degradation system. The efficiency of the model generated by the ANN was then tested and compared with the experimental results obtained. In both cases, the results corroborate the idea that aeration and temperature are crucial to increasing the efficiency of biodegradation.
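As a sketch of the modelling approach (the architecture and data below are invented for illustration; the abstract does not specify the network used), a small one-hidden-layer network can be trained to map the four operating variables to a degradation efficiency:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: 4 inputs (phosphate, temperature, agitation,
# pH), scaled to [0, 1]; target is a made-up smooth "degradation
# efficiency", NOT the study's measurements.
X = rng.random((200, 4))
y = (0.5 * X[:, 1] + 0.3 * X[:, 2] + 0.2 * np.sin(3 * X[:, 3]))[:, None]

# One hidden layer of tanh units, linear output, trained by plain
# gradient descent on the mean-squared error.
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)          # loss before training

for _ in range(2000):
    h, pred = forward(X)
    err = pred - y                          # gradient of MSE w.r.t. pred
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)        # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)            # loss after training
```

In the study's setting, the inputs would be the measured operating conditions and the target the observed phenol degradation, with separate recall and generalization sets to evaluate the fitted model.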