936 results for Interdisciplinary approach to knowledge
Abstract:
In the past few years, vehicular ad hoc networks (VANETs) have been studied extensively by researchers. A VANET is a type of P2P network, though it has some distinct characteristics (fast-moving nodes, short-lived connections, etc.). In this paper, we present several limitations of current trust management schemes in VANETs and propose ways to counter them. We first review several trust management techniques in VANETs and argue that the ephemeral nature of VANETs renders them useless in practical situations. We identify that the problems of information cascading and oversampling, which commonly arise in social networks, also adversely affect trust management schemes in VANETs. To the best of our knowledge, we are the first to introduce information cascading and oversampling to VANETs. We show that simple voting for decision making leads to oversampling and gives incorrect results in VANETs. To overcome this problem, we propose a novel voting scheme in which each vehicle has a different voting weight according to its distance from the event: the closer a vehicle is to the event, the higher its weight. Simulations show that our proposed algorithm performs better than simple voting, increasing the correctness of the decision. © 2012 Springer Science + Business Media, LLC.
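To make the idea concrete, below is a minimal sketch of distance-weighted event voting in the spirit described above; the inverse-distance weight 1/(d + eps) and the sample reports are illustrative assumptions, not the paper's exact scheme.

```python
from dataclasses import dataclass

@dataclass
class Report:
    vote: bool       # True if the reporting vehicle confirms the event
    distance: float  # distance (m) between the reporting vehicle and the event

def weighted_decision(reports, eps=1.0):
    """Decide whether an event is real by distance-weighted voting.

    Closer vehicles get larger weights (here 1/(distance + eps));
    the exact weighting function is an illustrative assumption.
    """
    yes = sum(1.0 / (r.distance + eps) for r in reports if r.vote)
    no = sum(1.0 / (r.distance + eps) for r in reports if not r.vote)
    return yes > no

# Example: two distant vehicles relaying a cascaded "yes" are outweighed
# by one nearby vehicle that directly observed no event.
reports = [Report(True, 500.0), Report(True, 480.0), Report(False, 20.0)]
print(weighted_decision(reports))  # False
```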
Abstract:
Fertilization of guava relies on soil and tissue testing. The interpretation of tissue tests is currently conducted by comparing nutrient concentrations or dual ratios with critical values or ranges. The critical value approach is affected by nutrient interactions. Nutrient interactions can be described by dual ratios, where two nutrients are compressed into a single expression, or by ternary diagrams, where one redundant proportion can be computed as the difference between 100% and the sum of the other two. There are D(D-1) possible dual ratios in a D-part composition, and most of them are thus redundant. Nutrients are components of a mixture and convey relative, not absolute, information on the composition. There are D-1 balances between components or ingredients in any mixture. Compositional data are intrinsically redundant, scale dependent and non-normally distributed. Based on the principles of equilibrium and orthogonality, the nutrient balance concept projects D-1 isometric log ratio (ilr) coordinates into Euclidean space. The D-1 balances between groups of nutrients are ordered to reflect knowledge of plant physiology, soil fertility and crop management. Our objective was to evaluate the ilr approach using nutrient data from a guava orchard survey and fertilizer trials across the state of São Paulo, Brazil. Cationic balances varied widely between orchards. We found that the Redfield N/P ratio of 13 was critical for high guava yield. We present guava yield maps in ternary diagrams. Although the ratio between nutrients changing in the same direction with time is often assumed to be stationary, most guava nutrient balances and dual ratios were found to be non-stationary. The ilr model provided an unbiased nutrient diagnosis of guava. © ISHS.
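As an illustration of the ilr (balance) computation the abstract refers to, the sketch below evaluates a single balance between two groups of nutrients; the particular partition ([N, P] versus [K]) and the leaf concentrations are hypothetical examples, not the ordered partition or data used in the study.

```python
import numpy as np

def ilr_balance(numerator, denominator):
    """Isometric log-ratio (balance) between two groups of parts.

    z = sqrt(r*s/(r+s)) * ln(g(numerator)/g(denominator)),
    where r and s are the group sizes and g() is the geometric mean.
    """
    num = np.asarray(numerator, dtype=float)
    den = np.asarray(denominator, dtype=float)
    r, s = len(num), len(den)
    g_num = np.exp(np.mean(np.log(num)))
    g_den = np.exp(np.mean(np.log(den)))
    return np.sqrt(r * s / (r + s)) * np.log(g_num / g_den)

# Hypothetical leaf composition (g/kg); the partition [N, P] vs [K] is
# illustrative, not the ordered partition used for guava diagnosis.
N, P, K = 22.0, 1.7, 17.0
print(ilr_balance([N, P], [K]))
```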
Abstract:
Spanish version available
Abstract:
Bio-molecular computing, 'computations performed by bio-molecules', is already challenging traditional approaches to computation both theoretically and technologically. Often placed within the wider context of 'bio-inspired', 'natural' or even 'unconventional' computing, the study of natural and artificial molecular computations is adding to our understanding of biology, the physical sciences and computer science well beyond the framework of existing design and implementation paradigms. In this introduction, we wish to outline the current scope of the field and assemble some basic arguments that bio-molecular computation is of central importance to computer science, the physical sciences and biology. Higher Order Logic (HOL) is used as the computational tool in our R&D work: we analyzed DNA as a chemical computing engine in our effort to develop novel formalisms, based on HOL, for understanding molecular-scale bio-chemical computing behavior. In our view, this work is one of the pioneering efforts in the promising domain of nano-bio scale chemical information processing dynamics.
Abstract:
Connectivity is the basic requirement for the proper operation of any wireless network. In a mobile wireless sensor network it is a challenge for applications and protocols to deal with connectivity problems, as links may go up and down frequently. In these scenarios, knowledge of a node's remaining connectivity time could both improve the performance of protocols (e.g. handoff mechanisms) and save possibly scarce node resources (CPU, bandwidth, and energy) by preventing unfruitful transmissions. This paper presents a solution called the Genetic Machine Learning Algorithm (GMLA) to forecast the remaining connectivity time in mobile environments. It combines Classifier Systems with a Markov chain model of the RF link quality. The main advantage of using an evolutionary approach is that the Markov model parameters can be discovered on-the-fly, making it possible to cope with unknown environments and mobility patterns. Simulation results show that the proposal is a very suitable solution, as it outperforms similar approaches.
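A rough sketch of the Markov-chain side of such a forecast is shown below: given a transition matrix over link-quality states with a "disconnected" absorbing state, the expected remaining connectivity time follows from the fundamental matrix. The state set and probabilities are invented for illustration; in GMLA these parameters would be discovered on-the-fly by the classifier system, which is not reproduced here.

```python
import numpy as np

# Markov chain over link-quality states: 0 = good, 1 = weak, 2 = disconnected.
# The transition probabilities below are illustrative only.
P = np.array([
    [0.90, 0.08, 0.02],   # good  -> good / weak / disconnected
    [0.30, 0.55, 0.15],   # weak  -> good / weak / disconnected
    [0.00, 0.00, 1.00],   # disconnected is absorbing
])

# Expected number of steps until absorption (remaining connectivity time,
# in sampling intervals), via the fundamental matrix N = (I - Q)^-1.
Q = P[:2, :2]                       # transitions among transient states
N = np.linalg.inv(np.eye(2) - Q)
expected_steps = N @ np.ones(2)
print(expected_steps)               # [time from 'good', time from 'weak']
```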
Abstract:
Biocompatible inorganic nano- and microcarriers can be suitable candidates for protein delivery. This study demonstrates facile methods of functionalization that use nanoscale linker molecules to change the protein adsorption capacity of hydroxyapatite (HA) powder. The adsorption capacity of bovine serum albumin as a model protein has been studied with respect to the surface modifications. The selected linker molecules (lysine, arginine, and phosphoserine) can influence the adsorption capacity by changing the electrostatic nature of the HA surface. Qualitative and quantitative analyses of linker-molecule interactions with the HA surface have been performed using NMR spectroscopy, zeta-potential measurements, X-ray photoelectron spectroscopy, and thermogravimetric analyses. Additionally, fits to the theoretical Langmuir and Freundlich isotherm models have been calculated. Lysine and arginine increased the protein adsorption, whereas phosphoserine reduced it. The results show that the adsorption capacity can be controlled with different functionalizations, depending on the protein-carrier combination under consideration. The scientific knowledge acquired from this study can be applied in various biotechnological applications that involve biomolecule-inorganic material interfaces.
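For reference, the two isotherm models mentioned above can be fitted to adsorption data as in the sketch below; the BSA data points and initial parameter guesses are purely illustrative, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, K):
    """Langmuir isotherm: q = q_max * K * c / (1 + K * c)."""
    return q_max * K * c / (1.0 + K * c)

def freundlich(c, K_f, n):
    """Freundlich isotherm: q = K_f * c**(1/n)."""
    return K_f * c ** (1.0 / n)

# Hypothetical BSA adsorption data (equilibrium concentration c in mg/mL,
# adsorbed amount q in mg/g HA); values are illustrative only.
c = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 4.0])
q = np.array([8.0, 15.0, 22.0, 30.0, 36.0, 40.0])

popt_l, _ = curve_fit(langmuir, c, q, p0=[45.0, 2.0])
popt_f, _ = curve_fit(freundlich, c, q, p0=[20.0, 2.0])
print("Langmuir  q_max, K:", popt_l)
print("Freundlich K_f, n :", popt_f)
```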
Abstract:
Canada Geese overflying the runways at London’s Heathrow Airport have been struck by aircraft on eleven occasions during the last ten years. Four of these strikes occurred during the pre-breeding season and seven during the post-moult period. A monitoring study was initiated in 1999 to evaluate the movements of geese around the airport and determine appropriate mitigation strategies to reduce the risk of birdstrike. Moult sites within 13 km of the airport were identified. 4,900 moulting geese were caught and fitted with colour rings and radio-transmitters between 1999 and 2004. 2,500 visits were made to over 300 sites, resulting in over 10,000 sightings of known individuals. Birds that crossed the airport approaches whilst moving between roost sites and feeding areas in newly harvested cereal crops were identified. Throughout the monitoring period efforts were made to control the risk, but by 2003 it was estimated that 10,000 bird transits of the approaches, involving almost 700 individuals, occurred during a 50-day period. The knowledge of the movements of ringed and tagged birds was used to inform a revised habitat management, daily roost dispersal and on-airfield bird deterrence programme in 2004. By adopting a flexible approach to management, an estimated 70% reduction in bird transits was achieved. This paper discusses the techniques used to achieve this reduction.
Abstract:
This work describes a methodology to simulate free-surface incompressible multiphase flows. This novel methodology allows the simulation of multiphase flows with an arbitrary number of phases, each of them having different densities and viscosities. Surface and interfacial tension effects are also included. The numerical technique is based on the GENSMAC front-tracking method. The velocity field is computed using a finite-difference discretization of a modification of the Navier-Stokes equations. These equations, together with the continuity equation, are solved for two-dimensional multiphase flows with different densities and viscosities in the different phases. The governing equations are solved on a regular Eulerian grid, and a Lagrangian mesh is employed to track free surfaces and interfaces. The method is validated by comparing numerical results with analytic solutions for a number of simple problems; it was also employed to simulate complex problems for which no analytic solutions are available. The method presented in this paper has been shown to be robust and computationally efficient. Copyright (c) 2012 John Wiley & Sons, Ltd.
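The Lagrangian/Eulerian coupling at the core of front-tracking can be sketched as follows: interface markers are advected with a velocity bilinearly interpolated from the regular grid. This is only a toy illustration of that single ingredient (no Navier-Stokes step, surface tension or front reconnection), with an assumed uniform flow field.

```python
import numpy as np

def bilinear(u, x, y, dx, dy):
    """Bilinearly interpolate grid field u (shape ny x nx) at point (x, y)."""
    i = min(max(int(x / dx), 0), u.shape[1] - 2)
    j = min(max(int(y / dy), 0), u.shape[0] - 2)
    tx, ty = x / dx - i, y / dy - j
    return ((1 - tx) * (1 - ty) * u[j, i]     + tx * (1 - ty) * u[j, i + 1] +
            (1 - tx) * ty       * u[j + 1, i] + tx * ty       * u[j + 1, i + 1])

def advect_markers(markers, u, v, dx, dy, dt):
    """Move Lagrangian interface markers with the Eulerian velocity (explicit Euler)."""
    return [(x + dt * bilinear(u, x, y, dx, dy),
             y + dt * bilinear(v, x, y, dx, dy)) for (x, y) in markers]

# Illustrative use: a uniform rightward flow displaces an interface marker chain.
nx = ny = 8
dx = dy = 0.1
u = np.ones((ny, nx))   # x-velocity on the Eulerian grid
v = np.zeros((ny, nx))  # y-velocity on the Eulerian grid
markers = [(0.2, 0.1 * k) for k in range(6)]
print(advect_markers(markers, u, v, dx, dy, dt=0.05))
```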
Abstract:
Organizational intelligence can be seen as a function of the viable structure of an organization. By integrating the Viable System Model and Soft Systems Methodology (systemic approaches to organizational management) with a focus on the role of the intelligence function, it is possible to elaborate a model of action with a structured methodology to prospect, select, process and distribute information across the entire organization, improving the efficacy and efficiency of all processes. This combination of methodologies is called the Intelligence Systems Methodology (ISM), whose assumptions and dynamics are delimited in this paper. The ISM is composed of two simultaneous activities: the Active Environmental Mapping and the Stimulated Action Cycle. The elaboration of a formal ISM description opens opportunities for applying the methodology to real situations, offering a new path for this specific issue of systems thinking: intelligence systems. Knowledge Management Research & Practice (2012) 10, 141-152. doi:10.1057/kmrp.2011.44
Abstract:
In spite of the high prevalence and negative impact of depression, little is known about its pathophysiology. Basic research on depression needs new animal models in order to increase knowledge of the disease and to search for new therapies. The work presented here aims to provide a neurobiologically validated model for investigating the relationships among sickness behavior, antidepressant treatment, and social dominance behavior. For this purpose, dominant individuals from dyads of male Swiss mice were treated with the bacterial endotoxin lipopolysaccharide (LPS) to induce social hierarchy destabilization. Two groups were treated with the antidepressants imipramine and fluoxetine prior to LPS administration. In these groups, antidepressant treatment prevented the occurrence of social destabilization. These results indicate that this model could be useful in providing new insights into the brain systems involved in depression.
Abstract:
Background: Recent medical and biological technology advances have stimulated the development of new testing systems that provide huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results: This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) correctness of processes, ensured using process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing through easy end-user interfaces.
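As a toy illustration of combining process-algebraic control flow with executable workflow definitions, the sketch below encodes a test workflow with ACP-style sequential and parallel composition and replays one admissible trace; the task names, the fixed left-first interleaving of the merge, and the in-memory representation are hypothetical simplifications, not the CEGH database schema or its ACP specifications.

```python
from dataclasses import dataclass
from typing import List

# ACP-style process terms: an atomic task, sequential composition (.)
# and parallel (merge) composition.

@dataclass
class Task:
    name: str

@dataclass
class Seq:
    left: object
    right: object

@dataclass
class Par:
    left: object
    right: object

def run(p, log: List[str]) -> List[str]:
    """Execute a process term, appending completed task names to `log`.
    Parallel branches are interleaved here in a fixed left-first order,
    which is one admissible trace of the merge operator."""
    if isinstance(p, Task):
        log.append(p.name)
    elif isinstance(p, (Seq, Par)):
        run(p.left, log)
        run(p.right, log)
    return log

# Hypothetical genetic-testing workflow, not an actual CEGH definition.
workflow = Seq(Task("extract_DNA"),
               Seq(Par(Task("run_PCR"), Task("update_patient_record")),
                   Task("report_result")))
print(run(workflow, []))
```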
Abstract:
Remanufacturing is the process of rebuilding used products so that the quality of the remanufactured products is equivalent to that of new ones. Although the theme is gaining ground, it is still little explored due to lack of knowledge and the difficulty of visualizing it systemically and implementing it effectively. Few models treat remanufacturing as a system; most studies still treat remanufacturing as an isolated process, preventing it from being seen in an integrated manner. Therefore, the aim of this work is to organize the knowledge about remanufacturing, offering a vision of the remanufacturing system and contributing to an integrated view of the theme. The methodology employed was a literature review, adopting the General Theory of Systems to characterize the remanufacturing system. This work consolidates and organizes the elements of this system, enabling a better understanding of remanufacturing and assisting companies in adopting the concept.
Abstract:
Electronic business surely represents the new development perspective for world-wide trade. Together with the idea of e-business, and the need to exchange business messages between trading partners, the concept of business-to-business (B2B) integration arose. B2B integration is becoming necessary to allow partners to communicate and exchange business documents, such as catalogues, purchase orders, reports and invoices, overcoming architectural, application and semantic differences, according to the business processes implemented by each enterprise. Business relationships can be very heterogeneous, and consequently there are various ways to integrate enterprises with each other. Moreover, nowadays not only large enterprises but also small and medium enterprises are moving towards e-business: more than two-thirds of Small and Medium Enterprises (SMEs) use the Internet as a business tool. One of the business areas actively facing the interoperability problem is that related to supply chain management. In order to really allow SMEs to improve their business and to fully exploit ICT technologies in their business transactions, there are three main players that must be considered and joined: the new emerging ICT technologies, the scenario and the requirements of the enterprises, and the world of standards and standardisation bodies. This thesis presents the definition and the development of an interoperability framework (and the related standardisation initiatives) to provide the Textile/Clothing sector with a shared set of business documents and protocols for electronic transactions. Considering also some limitations, the thesis proposes an ontology-based approach to improve the functionalities of the developed framework and, exploiting semantic web technologies, to improve the standardisation life-cycle, intended as the development, dissemination and adoption of B2B protocols for a specific business domain. The use of ontologies allows the semantic modelling of knowledge domains, upon which it is possible to develop a set of components for better management of B2B protocols and to ease their comprehension and adoption by the target users.
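A tiny sketch of what ontology-based modelling of B2B documents could look like with semantic web tooling is given below; the namespace, class and property names are hypothetical examples and do not reproduce the thesis ontology or the Textile/Clothing standard vocabulary.

```python
from rdflib import Graph, Namespace, RDF, RDFS, Literal

# A toy fragment of a Textile/Clothing B2B ontology; the namespace and the
# class/property names are hypothetical, not those of the developed framework.
TC = Namespace("http://example.org/textile-clothing#")

g = Graph()
g.bind("tc", TC)

g.add((TC.BusinessDocument, RDF.type, RDFS.Class))
g.add((TC.PurchaseOrder, RDF.type, RDFS.Class))
g.add((TC.PurchaseOrder, RDFS.subClassOf, TC.BusinessDocument))
g.add((TC.PurchaseOrder, RDFS.comment,
       Literal("Order sent from a clothing retailer to a textile supplier")))

g.add((TC.exchangedIn, RDF.type, RDF.Property))
g.add((TC.exchangedIn, RDFS.domain, TC.BusinessDocument))

print(g.serialize(format="turtle"))
```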
Abstract:
Motion control is a sub-field of automation in which the position and/or velocity of machines are controlled using some type of device. In motion control, the position, velocity, force, pressure, etc. profiles are designed in such a way that the different mechanical parts work as a harmonious whole, in which perfect synchronization must be achieved. The real-time exchange of information in the distributed system that an industrial plant is nowadays plays an important role in achieving ever better performance, effectiveness and safety. The network connecting field devices such as sensors and actuators, field controllers such as PLCs, regulators and drive controllers, and man-machine interfaces is commonly called a fieldbus. Since motion transmission is now a task of the communication system, and no longer of kinematic chains as in the past, the communication protocol must ensure that the desired profiles, and their properties, are correctly transmitted to the axes and then reproduced; otherwise the synchronization among the different parts is lost, with all the resulting consequences. In this thesis, the problem of trajectory reconstruction in the case of an event-triggered communication system is addressed. The most important feature that a real-time communication system must have is the preservation of the following temporal and spatial properties: absolute temporal consistency, relative temporal consistency and spatial consistency. Starting from the basic system composed of one master and one slave, and passing through systems made up of many slaves and one master, or many masters and one slave, the problems of profile reconstruction, preservation of temporal properties and, subsequently, synchronization of different profiles in networks adopting an event-triggered communication system are shown. These networks are characterized by the fact that common knowledge of a global time is not available; they are therefore non-deterministic networks. Each topology is analyzed, and the proposed solution based on phase-locked loops adopted for the basic master-slave case is extended to deal with the other configurations.
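The phase-locked-loop idea for the master-slave case can be sketched as a discrete-time software PLL in which the slave free-runs on its current rate estimate and corrects phase and rate whenever an event-triggered master sample arrives; the PI gains, sampling pattern and units below are illustrative assumptions, not the design developed in the thesis.

```python
def pll_track(master_samples, dt, kp=0.4, ki=0.05):
    """Track the master phase from sporadic samples with a PI software PLL.

    master_samples: time-ordered list of (time, master_phase) events.
    Returns the slave's (time, phase estimate) at every local tick dt.
    """
    phase, freq, k, out = 0.0, 0.0, 0, []
    n_steps = int(round(master_samples[-1][0] / dt))
    for step in range(n_steps + 1):
        t = step * dt
        phase += freq * dt                       # free-run between events
        if k < len(master_samples) and master_samples[k][0] <= t:
            err = master_samples[k][1] - phase   # phase error at the event
            freq += ki * err                     # integral path: rate estimate
            phase += kp * err                    # proportional path: phase pull-in
            k += 1
        out.append((round(t, 3), round(phase, 4)))
    return out

# The master moves at 1.0 units/s but is only observed at irregular event times.
events = [(0.1, 0.1), (0.4, 0.4), (0.5, 0.5), (0.9, 0.9), (1.2, 1.2)]
print(pll_track(events, dt=0.05)[-1])            # slave estimate at the last tick
```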
Abstract:
The objective of this dissertation is to develop and test a predictive model for the passive kinematics of human joints based on the energy minimization principle. To pursue this goal, the tibio-talar joint is chosen as a reference joint, for the reduced number of bones involved and its simplicity compared with other synovial joints such as the knee or the wrist. Starting from knowledge of the articular surface shapes, the spatial trajectory of passive motion is obtained as the envelope of joint configurations that maximize the congruence of the surfaces. An increase in joint congruence corresponds to an improved capability of distributing an applied load, allowing the joint to attain better strength with less material. Thus, joint congruence maximization is a simple geometric way to capture the idea of joint energy minimization. The results obtained are validated against in vitro measured trajectories. Preliminary comparisons provide strong support for the predictions of the theoretical model.
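A minimal sketch of congruence maximization as a numerical optimization is shown below, using two nested circular arcs as stand-ins for the articular surfaces; the surface shapes, radii and sum-of-squared-gaps congruence measure are illustrative assumptions, not the model of the dissertation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy articular surfaces: the "talar" surface is an arc of radius r_talus and
# the "tibial" surface a slightly larger concave arc of radius r_tibia.
r_talus, r_tibia = 20.0, 21.0             # mm (illustrative)
theta = np.linspace(-0.4, 0.4, 41)        # angular samples on the contact arc

def incongruence(center_offset):
    """Sum of squared gaps between the two surfaces for a given relative
    position (x, y) of the tibial arc center; smaller means more congruent."""
    cx, cy = center_offset
    talus = np.c_[r_talus * np.sin(theta), r_talus * np.cos(theta)]
    tibia = np.c_[cx + r_tibia * np.sin(theta), cy + r_tibia * np.cos(theta)]
    return np.sum((talus - tibia) ** 2)

# The most congruent relative position lets the larger arc nest over the
# smaller one: its center sits roughly 1 mm below the talar arc center.
res = minimize(incongruence, x0=[0.0, 0.0])
print(res.x)   # expected to be close to (0, -1) for these radii
```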