790 results for multi-language environment
Abstract:
This paper presents the development of a procedure that enables the analysis of nine pharmaceutical drugs in wastewater using gas chromatography-mass spectrometry (GC-MS) combined with solid-phase microextraction (SPME) for sample preparation. Experimental design was applied to optimize the in situ derivatization and the SPME extraction conditions. Ethyl chloroformate (ECF) was employed as the derivatizing agent and polydimethylsiloxane-divinylbenzene (PDMS-DVB) as the SPME fiber coating. A fractional factorial design was used to evaluate the main factors for the in situ derivatization and SPME extraction. Thereafter, a Doehlert matrix design was applied to determine the best experimental conditions. The method presented a linear range from 0.5 to 10 µg/L, and the intraday and interday precision were lower than 16%. The applicability of the method was verified with real influent and effluent samples from a wastewater treatment plant, as well as with samples from an industrial wastewater and a river.
Abstract:
Information flows are formed naturally or are formally induced in organizational settings, passing from the strategic level to the operational level, reflecting and impacting on the processes that make up the organization, including the decision-making process and therefore the organization's action strategies. The management of organizational environments based on information requires careful attention to the various kinds of language used for communication between the sectors and employees of the organization, whose goal is to share, disseminate and socialize the information produced in this environment.
Abstract:
The objective of the present work was to propose a method for testing the contribution of each level of the factors in a genotype x environment (GxE) interaction in multi-environment trial analyses by means of an F test. The study evaluated a data set with twenty genotypes and thirty-four environments, in a block design with four replications. The sums of squares within rows (genotypes) and columns (environments) of the GxE matrix were simulated, generating 10,000 experiments to verify the empirical distribution. Results indicate a noncentral chi-square distribution for rows and columns of the GxE interaction matrix, which was also verified by the Kolmogorov-Smirnov test and a Q-Q plot. Application of the F test identified the genotypes and environments that contributed the most to the GxE interaction. In this way, geneticists can select good genotypes in their studies.
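The simulation step described in this abstract can be sketched in a few lines. Under hypothetical interaction effects (the `means` vector below is illustrative, not taken from the paper), the sum of squares of standardised normal deviates should follow a noncentral chi-square distribution with mean k + λ and variance 2(k + 2λ), where λ is the sum of the squared standardised effects:

```python
import random
import statistics

def simulate_row_ss(means, sigma=1.0, n_sim=10000, seed=42):
    """Simulate sums of squares of normal deviates with the given means
    (standardised by sigma), as a proxy for one row of a GxE matrix."""
    rng = random.Random(seed)
    return [sum((rng.gauss(mu, sigma) / sigma) ** 2 for mu in means)
            for _ in range(n_sim)]

k = 5                               # hypothetical number of environments
means = [0.5, -0.3, 0.8, 0.0, 1.2]  # hypothetical interaction effects
lam = sum(m * m for m in means)     # noncentrality parameter
sims = simulate_row_ss(means)

# Noncentral chi-square theory: mean = k + lambda, variance = 2(k + 2*lambda)
print(statistics.fmean(sims), k + lam)
print(statistics.variance(sims), 2 * (k + 2 * lam))
```

The empirical mean and variance should closely match the theoretical moments; a Kolmogorov-Smirnov test against the full noncentral chi-square CDF, as in the paper, would be the stricter check.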
Abstract:
Breakthrough advances in microprocessor technology and efficient power management have altered the course of processor development with the emergence of multi-core processor technology, bringing a higher level of processing. The use of many-core technology has boosted the computing power provided by clusters of workstations or SMPs, delivering large computational power at an affordable cost using solely commodity components. Different implementations of message-passing libraries and system software (including operating systems) are installed in such cluster and multi-cluster computing systems. To guarantee the correct execution of a message-passing parallel application in a computing environment other than the one in which it was originally developed, a review of the application code is needed. In this paper, a hybrid communication interfacing strategy is proposed to execute a parallel application on a group of computing nodes belonging to different clusters or multi-clusters (computing systems that may be running different operating systems and MPI implementations), interconnected with public or private IP addresses, and responding interchangeably to user execution requests. Experimental results demonstrate the feasibility and effectiveness of the proposed strategy through the execution of benchmark parallel applications.
Abstract:
Aims. We report on simultaneous observations and modeling of the mid-infrared (MIR), near-infrared (NIR), and submillimeter (sub-mm) emission of the source Sgr A*, associated with the supermassive black hole at the center of our Galaxy. Our goal was to monitor the activity of Sgr A* at different wavelengths in order to constrain the emitting processes and gain insight into the nature of the close environment of Sgr A*. Methods. We used the MIR instrument VISIR in the BURST imaging mode, the adaptive optics assisted NIR camera NACO, and the sub-mm antenna APEX to monitor Sgr A* over several nights in July 2007. Results. The observations reveal remarkable variability in the NIR and sub-mm during the five nights of observation. No source was detected in the MIR, but we derived the lowest upper limit for a flare at 8.59 µm (22.4 mJy with A(8.59 µm) = 1.6 +/- 0.5). This observational constraint leads us to discard the possibility that the observed NIR emission comes from a thermal component emitting at sub-mm frequencies. Moreover, comparison of the sub-mm and NIR variability shows that the highest NIR fluxes (flares) are coincident with the lowest sub-mm levels of our five-night campaign involving three flares. We explain this behavior by a loss of electrons from the system and/or by a decrease in the magnetic field, as might conceivably occur in scenarios involving fast outflows and/or magnetic reconnection.
Abstract:
The sources and concentrations of aliphatic hydrocarbons (AHs) and polycyclic aromatic hydrocarbons (PAHs), faecal and biogenic sterols, and trace metals at 10 sampling sites located in Laranjeiras Bay, a large Environmental Protection Area in the southern Atlantic region of Brazil, were determined to assess the sources of organic matter and the contamination status of estuarine sediments. Organic compounds were determined by GC-FID and GC-MS, and ICP-OES was used to evaluate trace metals. The total AH concentration ranged from 0.28 to 8.19 µg g⁻¹, and the n-C29 and n-C31 alkanes were predominant, indicating significant inputs from higher terrestrial plants. Unresolved complex mixtures (UCM) were not detected at any site, suggesting that the study area was not significantly contaminated by fossil fuels. The total PAH concentration varied from 3.85 to 89.2 ng g⁻¹. The ratios between selected PAH isomers showed that combustion of biomass, coal, and petroleum is the main source of PAHs in the study area. The concentrations of the faecal sterols coprostanol and epicoprostanol were below the detection limits, suggesting that sewage was not a significant contributor to sedimentary organic matter. The concentrations of the trace metals (As, Cr, Cu, Ni, Pb and Zn) were low, except near sites located at the mouths of rivers that discharge into the study area and near urbanised regions (Paranagua city and the adjoining harbour). In general, the concentrations of PAHs were below the threshold effect level (TEL). Although the As, Cr and Ni concentrations were above the TEL, the study area can be considered as preserved from human activities.
Abstract:
The objectives of the present study were to characterize and define homogeneous production environments of composite beef cattle in Brazil in terms of climatic and geographic variables using multivariate exploratory techniques, and to use them to assess the presence of G x E interaction for birth weight (BW) and weaning weight (WW). Data from animals born between 1995 and 2008 on 36 farms located in 27 municipalities of the Brazilian states were used. Fifteen years of climate observations (mean minimum and maximum annual temperature and mean annual rainfall) and geographic data (latitude, longitude and altitude) were obtained for each municipality where the farms were located for characterization of the production environments. Hierarchical and nonhierarchical cluster analyses were used to group farms located in regions with similar environmental variables into clusters. Six clusters of farms were formed. The effect of sire-cluster interaction was tested by single-trait analysis using the deviance information criterion (DIC). Genetic parameters were estimated by multi-trait analysis considering the same trait to be different in each cluster. According to the DIC values, the inclusion of the sire-cluster effect did not improve the fit of the genetic evaluation model for BW and WW. Estimates of genetic correlations among clusters ranged from -0.02 to 0.92. The low genetic correlations among the most studied regions suggest that a separate genetic evaluation should be undertaken for some regions.
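The farm-grouping step can be illustrated with a minimal sketch. The farm records below are hypothetical, and single-linkage agglomerative merging is only one of the hierarchical methods the study could have used; variables are standardised first so that temperature, rainfall, and altitude are comparable:

```python
import math

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def standardise(rows):
    """Scale each variable to zero mean and unit variance."""
    out_cols = []
    for col in zip(*rows):
        mu = sum(col) / len(col)
        sd = math.sqrt(sum((v - mu) ** 2 for v in col) / len(col)) or 1.0
        out_cols.append([(v - mu) / sd for v in col])
    return [list(r) for r in zip(*out_cols)]

def single_linkage(rows, n_clusters):
    """Naive agglomerative clustering: repeatedly merge the two closest
    clusters (minimum pairwise distance) until n_clusters remain."""
    clusters = [[i] for i in range(len(rows))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclid(rows[a], rows[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical farm records: (min temp, max temp, rainfall mm, altitude m)
farms = [[14, 28, 1200, 600], [15, 29, 1250, 580],
         [20, 34, 800, 200], [21, 35, 750, 180],
         [10, 22, 1800, 900]]
clusters = single_linkage(standardise(farms), 3)
print(clusters)
```

In practice the choice of linkage and of the number of clusters would be validated against the nonhierarchical (e.g. k-means) solution, as the study combined both families of methods.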
Abstract:
Background: The multi-relational approach has emerged as an alternative for analyzing structured data such as relational databases, since it allows data mining to be applied to multiple tables directly, thus avoiding expensive join operations and semantic losses; this work proposes an algorithm with a multi-relational approach. Methods: Aiming to compare the performance of the traditional and multi-relational approaches for mining association rules, this paper presents an empirical study of PatriciaMine, a traditional algorithm, and its proposed multi-relational counterpart, MR-Radix. Results: This work showed the performance advantages of the multi-relational approach over several tables, which avoids the high cost of join operations over multiple tables and semantic losses. MR-Radix proved faster than PatriciaMine, despite handling complex multi-relational patterns. Memory usage shows a more conservative growth curve for MR-Radix than for PatriciaMine, indicating that increasing demand for frequent items in MR-Radix does not result in a significant growth in memory usage, as it does in PatriciaMine. Conclusion: The comparative study between PatriciaMine and MR-Radix confirmed the efficacy of the multi-relational approach in the data mining process, both in terms of execution time and memory usage. Moreover, the proposed multi-relational algorithm, unlike other algorithms of this approach, is efficient for use in large relational databases.
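For readers unfamiliar with the task being benchmarked, a minimal sketch of association rule mining on a single (traditional) table follows. This illustrates the support/confidence measures only; it is not the MR-Radix or PatriciaMine algorithm, and the transactions are toy data:

```python
from itertools import combinations
from collections import Counter

def frequent_itemsets(transactions, min_support):
    """Count all 1- and 2-itemsets and keep those whose support
    (fraction of transactions containing the itemset) meets the threshold."""
    n = len(transactions)
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for item in items:
            counts[(item,)] += 1
        for pair in combinations(items, 2):
            counts[pair] += 1
    return {iset: c / n for iset, c in counts.items() if c / n >= min_support}

def confidence(freq, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent."""
    joint = tuple(sorted(set(antecedent) | set(consequent)))
    return freq[joint] / freq[tuple(sorted(antecedent))]

# Toy transactions over a single table
tx = [["a", "b"], ["a", "b", "c"], ["a", "c"], ["b", "c"]]
freq = frequent_itemsets(tx, min_support=0.5)
print(freq)
print(confidence(freq, ("a",), ("b",)))  # support(a,b) / support(a)
```

The multi-relational setting differs in that itemsets span several joined tables; avoiding the materialised join is precisely the cost the abstract says MR-Radix eliminates.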
Abstract:
Unlike traditional wireless networks, characterized by the presence of last-mile, static and reliable infrastructures, Mobile Ad Hoc Networks (MANETs) are dynamically formed by collections of mobile and static terminals that exchange data by enabling each other's communication. Supporting multi-hop communication in a MANET is a challenging research area because it requires cooperation between different protocol layers (MAC, routing, transport). In particular, MAC and routing protocols could be considered mutually cooperative protocol layers. When a route is established, the exposed and hidden terminal problems at the MAC layer may decrease the end-to-end performance proportionally with the length of each route. Conversely, contention at the MAC layer may cause a routing protocol to respond by initiating new route queries and routing table updates. Multi-hop communication may also benefit from the presence of pseudo-centralized virtual infrastructures obtained by grouping nodes into clusters. Clustering structures may facilitate the spatial reuse of resources by increasing the system capacity; at the same time, the clustering hierarchy may be used to coordinate transmission events inside the network and to support intra-cluster routing schemes. Again, MAC and clustering protocols could be considered mutually cooperative protocol layers: the clustering scheme could support MAC-layer coordination among nodes by shifting the distributed MAC paradigm towards a pseudo-centralized MAC paradigm. On the other hand, the system benefits of the clustering scheme could be emphasized by the pseudo-centralized MAC layer with support for differentiated access priorities and controlled contention. In this thesis, we propose cross-layer solutions involving the joint design of MAC, clustering and routing protocols in MANETs. As the main contribution, we study and analyze the integration of MAC and clustering schemes to support multi-hop communication in large-scale ad hoc networks.
A novel clustering protocol, named Availability Clustering (AC), is defined under general node-heterogeneity assumptions in terms of connectivity, available energy and relative mobility. On this basis, we design and analyze a distributed and adaptive MAC protocol, named Differentiated Distributed Coordination Function (DDCF), whose focus is to implement adaptive access differentiation based on the node roles assigned by the upper-layer clustering scheme. We extensively simulate the proposed clustering scheme, showing its effectiveness in dominating the network dynamics under several stressing mobility models and different mobility rates. Based on these results, we propose a possible application of the cross-layer MAC+clustering scheme to support the fast propagation of alert messages in a vehicular environment. At the same time, we investigate the integration of MAC and routing protocols in large-scale multi-hop ad hoc networks. A novel multipath routing scheme is proposed, extending the AOMDV protocol with a novel load-balancing approach to concurrently distribute the traffic among the multiple paths. We also study the combined effect of an IEEE 802.11-based enhanced MAC forwarding mechanism called Fast Forward (FF), used to reduce the effects of self-contention among frames at the MAC layer. The protocol framework is modelled and extensively simulated for a large set of metrics and scenarios. For both schemes, the simulation results reveal the benefits of the cross-layer MAC+routing and MAC+clustering approaches over single-layer solutions.
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons that the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and systems of interacting agents as fundamental abstractions for designing, developing and managing, at runtime, typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies has become a central point of scientific activity. Currently, most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models becomes fundamental for comparing and evaluating methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e.
the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is at least clear that all non-agent elements of a multi-agent system are typically considered to be part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some functions) and topology abstractions (entities of the environment that represent the, either logical or physical, spatial structure). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for supporting the management of the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, where environment abstractions and layering principles are exploited for engineering multi-agent systems.
Abstract:
It is well known that the deposition of gaseous pollutants and aerosols plays a major role in the deterioration of monuments and built cultural heritage in European cities. Despite the many studies dedicated to environmental damage to cultural heritage, in the case of cement mortars, commonly used in 20th-century architecture, the deterioration due to the impact of air multi-pollutants, especially the formation of black crusts, is still not well explored, making this issue a challenging area of research. This work centers on cement mortar-environment interactions, focusing on the diagnosis of damage to the modern built heritage due to air multi-pollutants. For this purpose, three sites exposed to different urban areas in Europe were selected for sampling and subsequent laboratory analyses: Centennial Hall, Wroclaw (Poland); Chiesa dell'Autostrada del Sole, Florence (Italy); and Casa Galleria Vichi, Florence (Italy). The sampling sessions were performed taking into account the height from ground level and the protection from rain run-off (sheltered, partly sheltered and exposed areas). The complete characterization of the collected damage layers and underlying materials was performed using a range of analytical techniques: optical and scanning electron microscopy, X-ray diffractometry, differential and gravimetric thermal analysis, ion chromatography, flash combustion/gas chromatographic analysis, and inductively coupled plasma-optical emission spectrometry. The data were elaborated using statistical methods (i.e. principal component analysis), and the enrichment factor for cement mortars was calculated for the first time. The results obtained from the experimental activity performed on the damage layers indicate that gypsum, due to the deposition of atmospheric sulphur compounds, is the main damage product at surfaces sheltered from rain run-off at Centennial Hall and Casa Galleria Vichi.
By contrast, gypsum was not identified in the samples collected at Chiesa dell'Autostrada del Sole. This is connected to the restoration works, particularly surface cleaning, regularly performed for the maintenance of the building. Moreover, the results demonstrated a correlation between the location of a building and the composition of its damage layer: Centennial Hall mainly suffers the impact of pollutants emitted by nearby coal power stations, whilst Casa Galleria Vichi is principally affected by pollutants from vehicular exhaust in front of the building.
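The enrichment factor mentioned above is conventionally computed as the concentration ratio of a target element to a reference element in the sample, normalised by the same ratio in a background material; values well above 1 point to a non-crustal (e.g. anthropogenic) source. The concentrations and the choice of calcium as reference below are purely illustrative, not values from the study:

```python
def enrichment_factor(c_x_sample, c_ref_sample, c_x_bg, c_ref_bg):
    """EF = (C_x / C_ref)_sample / (C_x / C_ref)_background.
    EF >> 1 suggests an external (e.g. anthropogenic) origin of element x."""
    return (c_x_sample / c_ref_sample) / (c_x_bg / c_ref_bg)

# Hypothetical values: sulphur in a damage layer vs. the underlying mortar,
# with calcium as the (assumed) reference element
ef = enrichment_factor(c_x_sample=4.2, c_ref_sample=30.0,
                       c_x_bg=0.3, c_ref_bg=32.0)
print(round(ef, 1))  # EF >> 1: consistent with deposited sulphur compounds
```

For cement mortars the key difficulty, which the study addresses, is choosing an appropriate background composition, since mortar recipes vary between buildings.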
Abstract:
The hierarchical organisation of biological systems plays a crucial role in the pattern formation of gene expression resulting from morphogenetic processes, where the autonomous internal dynamics of cells, as well as cell-to-cell interactions through membranes, are responsible for the emergent peculiar structures of the individual phenotype. Being able to reproduce the system dynamics at the different levels of such a hierarchy can be very useful for studying such a complex phenomenon of self-organisation. The idea is to model the phenomenon in terms of a large and dynamic network of compartments, where the interplay between inter-compartment and intra-compartment events determines the emergent behaviour resulting in the formation of spatial patterns. On these premises, the thesis proposes a review of the different approaches already developed for modelling developmental biology problems, as well as the main models and infrastructures available in the literature for modelling biological systems, analysing their capabilities in tackling multi-compartment/multi-level models. The thesis then introduces a practical framework, MS-BioNET, for modelling and simulating these scenarios by exploiting the potential of multi-level dynamics. This is based on (i) a computational model featuring networks of compartments and an enhanced model of chemical reactions addressing molecule transfer, (ii) a logic-oriented language to flexibly specify complex simulation scenarios, and (iii) a simulation engine based on the many-species/many-channels optimised version of Gillespie's direct method. The thesis finally proposes the adoption of the agent-based model as an approach capable of capturing multi-level dynamics. To overcome the problem of parameter tuning in the model, the simulators are supplied with a module for parameter optimisation.
The task is defined as an optimisation problem over the parameter space in which the objective function to be minimised is the distance between the output of the simulator and a target output. The problem is tackled with a metaheuristic algorithm. As an example of application of the MS-BioNET framework and of the agent-based model, a model of the first stages of Drosophila melanogaster development is realised. The goal of the model is to generate the early spatial pattern of gap gene expression. The correctness of the models is shown by comparing the simulation results with real gene expression data with spatial and temporal resolution, acquired from free online sources.
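Gillespie's direct method, on which the MS-BioNET simulation engine builds, is compact enough to sketch in full. At each step the waiting time to the next event is drawn from an exponential with rate equal to the total propensity, and one reaction is chosen with probability proportional to its propensity. The toy degradation process below (and its rate constant) is illustrative, not a model from the thesis:

```python
import random

def gillespie_direct(x0, reactions, t_max, seed=1):
    """Gillespie's direct method over a state dict and a list of
    (propensity_fn, update_fn) reaction channels."""
    rng = random.Random(seed)
    t, x = 0.0, dict(x0)
    history = [(t, dict(x))]
    while True:
        props = [rate(x) for rate, _ in reactions]
        a0 = sum(props)
        if a0 <= 0:
            break                    # no reaction can fire any more
        dt = rng.expovariate(a0)     # exponential waiting time
        if t + dt > t_max:
            break
        t += dt
        r = rng.random() * a0        # pick a channel proportionally
        acc = 0.0
        for p, (_, update) in zip(props, reactions):
            acc += p
            if r < acc:
                update(x)
                break
        history.append((t, dict(x)))
    return history

# Toy degradation process A -> 0 with stochastic rate constant c
c = 0.5
reactions = [(lambda x: c * x["A"],
              lambda x: x.__setitem__("A", x["A"] - 1))]
hist = gillespie_direct({"A": 100}, reactions, t_max=50.0)
print(len(hist), hist[-1])
```

The many-species/many-channels optimisation mentioned in the abstract replaces the linear scan over channels with dependency graphs and indexed priority queues, but the stochastic logic is the same.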
Abstract:
While the use of distributed intelligence has been incrementally spreading in the design of a great number of intelligent systems, the field of Artificial Intelligence in Real-Time Strategy (RTS) games has remained a mostly centralized environment. Although turn-based games have attained AIs of world-class level, the fast-paced nature of RTS games has proven to be a significant obstacle to the quality of their AIs. Chapter 1 introduces RTS games, describing their characteristics, mechanics and elements. Chapter 2 introduces Multi-Agent Systems and the use of the Beliefs-Desires-Intentions abstraction, analysing the possibilities offered by self-computing properties. Chapter 3 analyses the current state of AI development in RTS games, highlighting the struggle of the gaming industry to produce valuable AIs: the focus on improving the multiplayer experience has gravely impacted the quality of the AIs, leaving them with serious flaws that impair their ability to challenge and entertain players. Chapter 4 explores different aspects of AI development for RTS games, evaluating the potential strengths and weaknesses of an agent-based approach and analysing which aspects can benefit the most compared with centralized AIs. Chapter 5 describes a generic agent-based framework for RTS games where every game entity becomes an agent, each having its own knowledge and set of goals. Different aspects of the game, like economy, exploration and warfare, are also analysed, and some agent-based solutions are outlined. The possible exploitation of self-computing properties to efficiently organize the agents' activity is then inspected. Chapter 6 presents the design and implementation of an AI for an existing open-source game in beta development stage: 0 A.D., a historical RTS game on ancient warfare featuring a modern graphical engine and evolved mechanics.
The entities of the conceptual framework are implemented in a new agent-based platform, called ABot, seamlessly nested inside the existing game engine and described at length in Chapters 7, 8 and 9. Chapters 10 and 11 cover the design and realization of a new agent-based language useful for defining behavioural modules for the agents in ABot, paving the way for a wider spectrum of contributors. Chapter 12 concludes the work by analysing the outcome of tests meant to evaluate strategies, realism and pure performance, finally drawing conclusions and outlining future work in Chapter 13.
Abstract:
The objectives of the thesis are to develop new methodologies for the study of the space and time variability of the Italian upper-ocean ecosystem through the combined use of multi-sensor satellite data and in situ observations, and to identify the capabilities and limits of remote sensing observations for monitoring the marine state at short and long time scales. Three oceanographic basins were selected and subjected to different types of analyses. The first region is the Tyrrhenian Sea, where a comparative analysis of altimetry and Lagrangian measurements was carried out to study the surface circulation. The results allowed us to deepen the knowledge of the Tyrrhenian Sea surface dynamics and its variability and to define the limitations of satellite altimetry measurements in detecting small-scale marine circulation features. The Channel of Sicily study aimed to identify the spatio-temporal variability of phytoplankton biomass and to understand the impact of the upper-ocean circulation on the marine ecosystem. A combined analysis of long-term satellite time series of chlorophyll, sea surface temperature and sea level field data was applied. The results allowed us to identify the key role of the Atlantic water inflow in modulating the seasonal variability of phytoplankton biomass in the region. Finally, the Italian coastal marine system was studied with the objective of exploring the potential of Ocean Color data for detecting chlorophyll trends in coastal areas. The most appropriate methodology for detecting long-term environmental changes was defined through an intercomparison of chlorophyll trends detected by in situ and satellite data. Italian coastal areas subject to eutrophication problems were then identified. This work has demonstrated that satellite data constitute a unique opportunity to define the features and forcings influencing upper-ocean ecosystem dynamics and can also be used to monitor environmental variables capable of influencing phytoplankton productivity.
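The abstract does not name the trend-detection method; one standard choice for environmental time series, shown here purely as an illustrative assumption, is the non-parametric Mann-Kendall test, which counts concordant minus discordant pairs. The chlorophyll series below is synthetic (trend plus seasonality), not data from the thesis:

```python
import math

def mann_kendall(series):
    """Mann-Kendall test for a monotonic trend. Returns the S statistic
    and the approximate z-score (normal approximation, no tie correction)."""
    n = len(series)
    s = sum((series[j] > series[i]) - (series[j] < series[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical monthly chlorophyll anomalies: upward trend plus seasonality
chl = [0.05 * t + 0.3 * math.sin(t) for t in range(60)]
s, z = mann_kendall(chl)
print(s, z)  # |z| > 1.96 indicates a significant trend at the 5% level
```

Running the same test on collocated in situ and satellite series, and comparing the resulting z-scores, is one way to perform the intercomparison the abstract describes.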
Abstract:
Membrane proteins play a major role in every living cell. They are the key factors in the cell's metabolism and in other functions, for example in cell-cell interaction, signal transduction, and the transport of ions and nutrients. Cytochrome c oxidase (CcO), as one of the membrane proteins of the respiratory chain, plays a significant role in the energy transformation of higher organisms. CcO is a multi-centered heme protein, utilizing redox energy to actively transport protons across the mitochondrial membrane. One aim of this dissertation is to investigate single steps in the mechanism of the ion transfer process coupled to electron transfer, which are not fully understood. The protein-tethered bilayer lipid membrane is a general approach to immobilize membrane proteins in an oriented fashion on a planar electrode embedded in a biomimetic membrane. This system enables the combination of electrochemical techniques with surface-enhanced resonance Raman spectroscopy (SERRS), surface-enhanced infrared reflection absorption spectroscopy (SEIRAS), and surface plasmon spectroscopy to study protein-mediated electron and ion transport processes. The orientation of the enzymes within the surface-confined architecture can be controlled by specific site mutations, i.e. the insertion of a poly-histidine tag into different subunits of the enzyme. CcO can thus be oriented uniformly with its natural electron pathway entry pointing either towards or away from the electrode surface. The first orientation allows an ultra-fast direct electron transfer (ET) into the protein, not provided by conventional systems, which can be leveraged to study intrinsic charge transfer processes. The second orientation permits the study of the interaction with its natural electron donor, cytochrome c. Electrochemical and SERR measurements show conclusively that the redox site structure and the activity of the surface-confined enzyme are preserved.
This biomimetic system therefore offers a unique platform to study the kinetics of the ET processes in order to clarify mechanistic properties of the enzyme. Highly sensitive and ultra-fast electrochemical techniques allow the separation of ET steps between all four redox centres, including the determination of ET rates. Furthermore, proton transfer coupled to ET could be directly measured and discriminated from other ion transfer processes, revealing novel information on the proton transfer mechanism of cytochrome c oxidase. In order to study the kinetics of ET inside the protein, including the catalytic center, time-resolved SEIRAS and SERRS measurements were performed to gain more insight into the structural and coordination changes of the heme environment. The electrical behaviour of tethered membrane systems and membrane-intrinsic proteins, as well as the related charge transfer processes, was simulated by solving the respective sets of differential equations using the software package SPICE. This helps to understand charge transfer processes across membranes and to develop models that can help to elucidate the mechanisms of complex enzymatic processes.
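The kind of equivalent-circuit simulation mentioned above can be sketched with a deliberately simplified model: a membrane capacitance C_m in parallel with a resistance R_m, charged through a series (electrolyte) resistance R_s by a potential step E. All component values here are hypothetical placeholders, and explicit Euler integration stands in for a SPICE solver:

```python
# Explicit Euler integration of the membrane equation
#   C_m * dV_m/dt = (E - V_m) / R_s - V_m / R_m
# for a hypothetical tethered-membrane equivalent circuit.

def simulate_step(E=0.01, R_s=1e3, R_m=1e7, C_m=1e-6, dt=1e-5, t_max=0.05):
    v, t, trace = 0.0, 0.0, []
    while t <= t_max:
        trace.append((t, v))
        dv = ((E - v) / R_s - v / R_m) / C_m  # membrane charging current
        v += dv * dt
        t += dt
    return trace

trace = simulate_step()
v_inf = 0.01 * 1e7 / (1e3 + 1e7)  # analytic steady state: E * R_m / (R_s + R_m)
print(trace[-1][1], v_inf)
```

Fitting the simulated transient to measured chronoamperometric data is what allows circuit parameters, and hence charge transfer processes, to be extracted; the real models in the thesis additionally include the protein's redox centres as circuit elements.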