864 results for Tree based intercrop systems


Relevance: 100.00%

Abstract:

This project provides an in-depth competitive assessment of the Portuguese indoor location-based analytics market and elaborates an entry-pricing strategy for implementing the Business Intelligence Positioning System (BIPS) in Portuguese shopping centre stores. The role of industry forces and of the company's organizational resource platform in sustaining competitive advantage was explored. A customer value-based pricing approach was adopted to assess the value of BIPS to retailers and to maximize Sonae Sierra's profitability. The exploratory quantitative research found a market opportunity to address every store area type with tailored proposals and to set higher-than-tested membership fees that allow a rapid ROI, concluding that conditions are propitious for Sierra to succeed with the BIPS in-store business model in Portugal.

Relevance: 100.00%

Abstract:

The development of organic materials displaying high two-photon absorption (TPA) has attracted much attention in recent years due to a variety of potential applications in photonics and optoelectronics, such as three-dimensional optical data storage, fluorescence imaging, two-photon microscopy, optical limiting, microfabrication, photodynamic therapy, and upconverted lasing. The most frequently employed structural motifs for TPA materials are donor–π bridge–acceptor (D–π–A) dipoles, donor–π bridge–donor (D–π–D) and acceptor–π bridge–acceptor (A–π–A) quadrupoles, octupoles, etc. In this work we present the synthesis and photophysical characterization of quadrupolar heterocyclic systems with potential applications in materials and biological sciences as TPA chromophores. Indole is a versatile building block for the synthesis of heterocyclic systems for several optoelectronic applications (chemosensors, nonlinear optics, OLEDs) owing to its photophysical properties and electron-donating ability, while the 4H-pyran-4-ylidene fragment is frequently used for the synthesis of red light-emitting materials. In addition, the 2-(2,6-dimethyl-4H-pyran-4-ylidene)malononitrile (1) and 1,3-diethyl-dihydro-5-(2,6-dimethyl-4H-pyran-4-ylidene)-2-thiobarbituric acid (2) units are commonly used as strong acceptor moieties for the preparation of π-conjugated push-pull systems. These building blocks were prepared by Knoevenagel condensation of the corresponding ketone precursor with malononitrile or 1,3-diethyl-dihydro-2-thiobarbituric acid. The new quadrupolar 4H-pyran-4-ylidene fluorophores (3) derived from indole were prepared through condensation of 5-methyl-1H-indole-3-carbaldehyde with acceptor precursors 1 and 2 in the presence of a catalytic amount of piperidine. The new compounds were characterized by the usual spectroscopic techniques (UV-vis, FT-IR and multinuclear 1H and 13C NMR).

Relevance: 100.00%

Abstract:

[Excerpt] In this work, different multilayer structures were developed using a poly(hydroxybutyrate-co-valerate) film with a valerate content of 8% (PHBV8) as support, with the aim of producing active bio-based multilayer systems. Interlayers of zein nanofibers, with and without cinnamaldehyde, were electrospun onto the PHBV8 film, and three multilayer systems were developed: 1) without an outer layer; 2) with a PHBV8 film as outer layer; and 3) with an alginate-based film as outer layer. Their physico-chemical properties were evaluated through water vapour and oxygen permeability measurements, colour measurements, Fourier-transform infrared spectroscopy (FTIR) and thermal analyses. Results showed that the presence of different outer layers affected the water vapour permeability and transparency of the multilayer films. (...)

Relevance: 100.00%

Abstract:

This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the application. For example, in a P2P file-sharing application, while the user is downloading a file, the application is in parallel serving that file to other users. Such peers may have limited hardware resources (e.g., CPU, bandwidth and memory), or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network typically operates in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively.

To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks: a set of adaptive broadcast solutions and an adaptive data replication solution that can serve as the basis of several P2P applications. Our data replication solution increases availability and reduces communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations.

The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that yields an approximated view of the system, or part of it, including the topology and the reliability of components expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays that maximize broadcast reliability, where reliability is expressed as a function of the selected paths' reliability and of the use of available resources. These resources are modeled as message quotas reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the memory available at each process into account by limiting the view it has to maintain of the system. Using this partial view, we propose three scalable broadcast algorithms based on a propagation overlay that tends towards the global tree overlay and adapts to the constraints of the underlying system.

At a higher level, the thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes communication cost.
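As a concrete illustration of routing over maximum-reliability tree overlays: if link reliabilities are independent probabilities, maximizing the product of link reliabilities along a path is equivalent to minimizing the sum of their negative logarithms, so a Dijkstra-style search yields the tree. The sketch below is a minimal illustration under that assumption, not the thesis's actual protocols (which also account for message quotas and partial views); all names are illustrative.

```python
import heapq
import math

def max_reliability_tree(nodes, links, root):
    """Grow a broadcast tree maximizing each node's path reliability.

    links maps (u, v) -> probability that a message sent on u->v is
    delivered (assumed independent). Maximizing a product of link
    reliabilities equals minimizing the sum of -log(p), so a
    Dijkstra-style search yields the max-reliability tree.
    """
    cost = {n: math.inf for n in nodes}   # -log(best path reliability)
    parent = {n: None for n in nodes}
    cost[root] = 0.0
    frontier = [(0.0, root)]
    done = set()
    while frontier:
        c, u = heapq.heappop(frontier)
        if u in done:
            continue
        done.add(u)
        for (a, b), p in links.items():
            if a == u and b not in done and p > 0:
                nc = c - math.log(p)
                if nc < cost[b]:
                    cost[b], parent[b] = nc, u
                    heapq.heappush(frontier, (nc, b))
    return parent  # tree edge: node -> its parent towards the root

# Four peers with asymmetric, unreliable links.
nodes = ["r", "a", "b", "c"]
links = {("r", "a"): 0.9, ("r", "b"): 0.5, ("a", "b"): 0.8,
         ("b", "c"): 0.7, ("a", "c"): 0.4}
print(max_reliability_tree(nodes, links, "r"))
# {'r': None, 'a': 'r', 'b': 'a', 'c': 'b'}
```

Note that the direct link r->b (0.5) is bypassed in favour of the more reliable two-hop path through a (0.9 × 0.8 = 0.72), which is exactly the kind of adaptation to probabilistic component reliability described above.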

Relevance: 100.00%

Abstract:

"Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye)

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending what is expected from the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions among the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by these systems. The topology is also a key factor in explaining their extraordinary flexibility and resilience to perturbations in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and fault tolerance of systems in two different contexts.

In the first part, we study cellular automata, a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, which rely on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network where interactions among cells are governed by an underlying structure, usually a regular one. To increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable: the resulting topologies share properties of both regular and random networks, and display similarities with the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones.

In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean network model. In some ways this model is close to cellular automata, although it is not expected to perform any task; instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffers from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we used artificial topologies believed to be closer to those of gene regulatory networks; we also studied actual biological organisms and used parts of their genetic regulatory networks in our models. Secondly, we addressed the improbable full synchrony of the events taking place in Boolean networks and proposed a more biologically plausible cascading update scheme. Finally, we tackled the Boolean functions of the model, i.e., the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of previous GRN models, yet with superior resistance to perturbations. We believe they are one step closer to biological reality.
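For readers unfamiliar with the baseline model, the sketch below implements a classic Kauffman random Boolean network with the fully synchronous update that the thesis argues is biologically improbable; the cascading scheme and gene-specific update functions described above are refinements of this baseline. A minimal sketch; all identifiers are illustrative.

```python
import random

def make_rbn(n, k, seed=0):
    """Kauffman-style random Boolean network: each of n genes reads
    k randomly chosen genes and applies a random Boolean function
    stored as a truth table of 2**k entries."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Fully synchronous update: every gene fires at once -- the
    biologically improbable assumption the thesis replaces with a
    cascading scheme."""
    new = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for j in ins:                 # encode the input genes' states
            idx = (idx << 1) | state[j]
        new.append(table[idx])
    return new

inputs, tables = make_rbn(n=8, k=2)
rng = random.Random(1)
state = [rng.randint(0, 1) for _ in range(8)]
for t in range(5):                    # iterate towards an attractor
    print(t, state)
    state = step(state, inputs, tables)
```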

Relevance: 100.00%

Abstract:

The objective of this work was to assess the effects of conventional tillage and of different direct-seeding mulch-based cropping systems (DMC) on soil nematofauna characteristics. The long-term field experiment was carried out in the highlands of Madagascar on an andic Dystrustept soil. Soil samples were taken once a year during three successive years (14 to 16 years after installation of the treatments) from the 0-5-cm soil layer of a conventional tillage system and of three kinds of DMC: direct seeding on a dead mulch of residues from a soybean-maize rotation; direct seeding of a maize-maize rotation on a living mulch of silverleaf (Desmodium uncinatum); and direct seeding of a bean (Phaseolus vulgaris)-soybean rotation on a living mulch of kikuyu grass (Pennisetum clandestinum). The samples were compared with samples from natural fallows. The soil nematofauna, characterized by the abundance of the different trophic groups and by indices (MI, maturity index; EI and SI, enrichment and structure indices), allowed discrimination among the different cropping systems. The DMC treatments had a more complex soil food web than the tillage treatment: SI and MI were significantly greater in DMC systems. Moreover, DMC with dead mulch had a lower density of free-living nematodes than DMC with living mulch, which suggests lower microbial activity.
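For reference, the maturity index used above is conventionally the abundance-weighted mean of colonizer-persister (c-p) scores; the abstract does not restate the formula, so the standard definition is given here:

```latex
\mathrm{MI} = \sum_{i=1}^{n} v_i \, f_i
```

where $v_i$ is the c-p value (1 to 5) of free-living taxon $i$ and $f_i$ its relative abundance in the sample; a higher MI indicates a more structured, less disturbed food web.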

Relevance: 100.00%

Abstract:

Data available in the literature were used to develop a warning system for bean angular leaf spot and anthracnose, caused by Phaeoisariopsis griseola and Colletotrichum lindemuthianum, respectively. The model is based on environmental conditions favorable to the infection process, namely continuous leaf wetness duration and mean air temperature during this subphase of the pathogen-host relationship cycle. Equations published by Dalla Pria (1977), describing the interaction of these two factors on disease severity, were used. An Excel spreadsheet was used to calculate the leaf wetness period needed to reach different infection probabilities in different temperature ranges. These data were used to build critical-period tables, which in turn were used to program a computerized electronic device that records leaf wetness duration and mean temperature and automatically outputs the daily disease severity value (DDSV) for each disease. The model should be validated in field experiments under natural infection, in which the daily disease severity sum (DDSS) should be calibrated as the criterion indicating the start and the interval of fungicide applications to control both diseases.
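To make the table-lookup mechanism concrete, here is a minimal sketch of how a critical-period table could map a day's mean temperature and leaf wetness duration to a DDSV and accumulate a DDSS. The threshold numbers and the spray trigger are hypothetical placeholders, not the published Dalla Pria equations.

```python
def daily_severity(wetness_hours, mean_temp_c, tables):
    """Look up the daily disease severity value (DDSV) from a
    critical-period table keyed by temperature range.

    tables: {(t_low, t_high): {severity: wetness_hours_required}}.
    Values are illustrative placeholders, not published data."""
    for (lo, hi), thresholds in tables.items():
        if lo <= mean_temp_c < hi:
            ddsv = 0
            for severity, hours_needed in sorted(thresholds.items()):
                if wetness_hours >= hours_needed:
                    ddsv = severity  # highest severity class reached
            return ddsv
    return 0  # temperature outside all modeled ranges

# Hypothetical critical-period tables for two temperature ranges.
tables = {(15.0, 20.0): {1: 8, 2: 12, 3: 16, 4: 20},
          (20.0, 25.0): {1: 6, 2: 10, 3: 14, 4: 18}}

# One week of (leaf wetness hours, mean temperature) records.
week = [(12, 22.5), (4, 21.0), (16, 23.0), (8, 18.0),
        (20, 22.0), (2, 16.5), (11, 21.5)]
ddss = sum(daily_severity(w, t, tables) for w, t in week)
print("DDSS =", ddss, "-> spray" if ddss >= 8 else "-> wait")
```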

Relevance: 100.00%

Abstract:

Owing to advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has long been to raise the operating frequency of the chip; however, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required computational power at low power consumption. On the one hand, this enables parallel execution of highly intensive applications, making these platforms attractive for various application domains, from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. Moreover, the high level of on-chip integration increases the probability of faults and creates hotspots that lead to thermal problems. Additionally, radiation, which is frequent in space but is also becoming an issue at ground level, can cause transient faults and eventually induce faulty execution of applications. It is therefore crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach where the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in, e.g., a hardware description language, namely VHDL.
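As an informal illustration of the kind of dynamic reconfiguration such agents perform, the toy sketch below monitors core health and temperature and migrates tasks away from faulty or overheating cores. This is only a sketch of the general idea in Python; the thesis itself works with formally refined models and VHDL, and all names and thresholds here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Core:
    cid: int
    healthy: bool = True
    temp_c: float = 50.0
    tasks: list = field(default_factory=list)

class ReconfigurationAgent:
    """Toy monitoring agent: treats faulty or overheated cores as
    unavailable and migrates their tasks to the coolest healthy core."""

    TEMP_LIMIT = 85.0  # degrees Celsius; illustrative threshold

    def __init__(self, cores):
        self.cores = cores

    def step(self):
        available = [c for c in self.cores
                     if c.healthy and c.temp_c < self.TEMP_LIMIT]
        if not available:
            return  # nothing to migrate to; a real system would degrade
        for core in self.cores:
            if core not in available and core.tasks:
                target = min(available, key=lambda c: c.temp_c)
                target.tasks.extend(core.tasks)   # migrate the workload
                core.tasks.clear()

cores = [Core(0, temp_c=92.0, tasks=["fft"]),     # hotspot
         Core(1, temp_c=60.0),
         Core(2, healthy=False, tasks=["fir"]),   # transient fault
         Core(3, temp_c=55.0)]
ReconfigurationAgent(cores).step()
print([(c.cid, c.tasks) for c in cores])
# [(0, []), (1, []), (2, []), (3, ['fft', 'fir'])]
```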

Relevance: 100.00%

Abstract:

This thesis, entitled 'Development of nitrifying and photosynthetic sulfur bacteria based bioaugmentation systems for the bioremediation of ammonia and hydrogen sulphide in shrimp culture', proposes a sustainable, low-cost option for mitigating toxic ammonia and hydrogen sulphide in shrimp culture systems. The use of 'bioaugmentors' as pond additives is an emerging field in aquaculture, and understanding the role of the organisms involved in a 'bioaugmentor' helps to optimize conditions for their activity. The thesis describes the immobilization of nitrifying consortia on wood powder. Shrimp grow-out systems are specialized and highly dynamic aquaculture production units which, when operated in zero-exchange mode, require bioremediation of ammonia, nitrite nitrogen and hydrogen sulphide to protect the crop. The research conducted here aims to develop an economically viable and user-friendly technology to address this problem. The nitrifying bacterial consortia (NBC) generated earlier (Achuthan et al., 2006) were used to develop the technology. The study clearly demonstrates the superior quality of the immobilized nitrifiers generated here for field application.

Relevance: 100.00%

Abstract:

Software systems are progressively being deployed in many facets of human life, and the failure of such systems can have varied impacts on their users. The fundamental aspect supporting a software system is a focus on quality. Reliability describes the ability of a system to function in a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier work focused on software reliability with no consideration of the hardware, or vice versa; however, a complete estimation of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves collecting failure data for hardware and software components and building a model on these data to predict reliability. To develop such a model, focus is given to systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on modeling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an integrated model for predicting the reliability of a computational system. The developed model is compared with existing models and its usefulness is discussed.
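The abstract does not specify the integrated model, but one common way to combine the two elements is a series model: the system survives a mission only if both hardware and software do. The sketch below pairs an exponential hardware model with the Goel-Okumoto software reliability growth model purely as an illustrative assumption; the parameters are invented.

```python
import math

def hw_reliability(mission_h, failure_rate):
    """Exponential hardware model: R_hw(x) = exp(-lambda * x)."""
    return math.exp(-failure_rate * mission_h)

def sw_reliability(test_h, mission_h, a, b):
    """Goel-Okumoto growth model: expected faults found by time t are
    m(t) = a * (1 - exp(-b * t)); reliability over a mission of
    length x after testing for t hours is exp(-(m(t + x) - m(t)))."""
    m = lambda t: a * (1.0 - math.exp(-b * t))
    return math.exp(-(m(test_h + mission_h) - m(test_h)))

def system_reliability(test_h, mission_h, failure_rate, a, b):
    """Series model: the system survives the mission only if both the
    hardware and the software do (assumed independent)."""
    return (hw_reliability(mission_h, failure_rate)
            * sw_reliability(test_h, mission_h, a, b))

# Invented parameters: lambda = 1e-4 failures/h, a = 120 expected
# total faults, b = 0.05 per test hour, 200 h of testing, 72 h mission.
print(round(system_reliability(200, 72, 1e-4, a=120, b=0.05), 4))
```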

Relevance: 100.00%

Abstract:

In this paper, we present a P2P-based database sharing system that provides information-sharing capabilities through keyword-based search techniques. Our system requires neither a global schema nor schema mappings between different databases, and our keyword-based search algorithms are robust in the presence of frequent changes in the content and membership of peers. To facilitate data integration, we introduce a keyword join operator that combines partial answers containing different keywords into complete answers. We also present an efficient algorithm that optimizes the keyword join operations for partial answer integration. Our experimental study on both real and synthetic datasets demonstrates the effectiveness of our algorithms and the efficiency of the proposed query processing strategies.
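To illustrate the flavor of such an operator, the sketch below pairwise-combines partial answers that share a join value and together cover more query keywords, emitting those that cover the whole query. This is a deliberately simplified, quadratic toy, not the paper's optimized algorithm; all data and identifiers are invented.

```python
from itertools import combinations

def keyword_join(partials, query_keywords):
    """Pairwise keyword join over partial answers.

    A partial answer is (keywords_covered, join_values, rows): the
    query keywords it matches, the attribute values usable for
    joining, and the contributing source tuples. Two partials are
    joined when they share a join value and together cover more
    keywords; results covering the whole query are complete answers."""
    complete = []
    for a, b in combinations(partials, 2):
        kw_a, keys_a, rows_a = a
        kw_b, keys_b, rows_b = b
        if keys_a & keys_b and (kw_a | kw_b) > kw_a:
            merged_kw = kw_a | kw_b
            if merged_kw == query_keywords:
                complete.append((merged_kw, keys_a | keys_b,
                                 rows_a + rows_b))
    return complete

# Partial answers from two peers' databases with no shared schema.
partials = [
    ({"hepatitis"}, {"pmid:42"}, [("db1.articles", 42)]),
    ({"vaccine"},   {"pmid:42"}, [("db2.trials", 7)]),
    ({"vaccine"},   {"pmid:99"}, [("db2.trials", 8)]),
]
print(keyword_join(partials, {"hepatitis", "vaccine"}))
```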

Relevance: 100.00%

Abstract:

The purpose of this study was to test the hypothesis that soil water content varies spatially with distance from a tree row and that the effect differs according to tree species. A field study was conducted on a kaolinitic Oxisol in the sub-humid highlands of western Kenya to compare soil water distribution and dynamics in a maize monoculture with those under maize (Zea mays L.) intercropped with a 3-year-old tree row of Grevillea robusta A. Cunn. ex R. Br. (grevillea) and a hedgerow of Senna spectabilis DC. (senna). Soil water content was measured at weekly intervals during one cropping season using a neutron probe. Measurements were made from 20 cm down to a depth of 225 cm at distances of 75, 150, 300 and 525 cm from the tree rows. The amount of water stored was greater under the sole maize crop than under the agroforestry systems, especially the grevillea-maize system. Stored soil water in the grevillea-maize system increased with increasing distance from the tree row, but in the senna-maize system it decreased between 75 and 300 cm from the hedgerow. Soil water content increased least, and more slowly early in the season, in the grevillea-maize system, and drying was also evident as the frequency of rain declined. Soil water content at the end of the cropping season was similar to that at the start of the season in the grevillea-maize system, but about 50 and 80 mm greater in the senna-maize and sole maize systems, respectively. The seasonal water balance showed there was 140 mm of drainage from the sole maize system; a similar amount was lost from the agroforestry systems (about 160 mm in the grevillea-maize system and 145 mm in the senna-maize system) through drainage or tree uptake. The possible benefits of reduced soil evaporation and crop transpiration close to a tree row were not evident in the grevillea-maize system, but appeared to largely compensate for water uptake losses in the senna-maize system. Grevillea, managed as a tree row, reduced stored soil water to a greater extent than senna, managed as a hedgerow.
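The seasonal water balance mentioned above is conventionally solved as a residual; in a generic form (our notation, since the abstract does not give the equation):

```latex
D + U_t = P - \Delta S - E_s - T_c
```

where $P$ is seasonal rainfall, $\Delta S$ the change in stored soil water, $E_s$ soil evaporation, $T_c$ crop transpiration, $D$ drainage and $U_t$ tree water uptake; $D$ and $U_t$ are confounded in the agroforestry plots, hence "drainage or tree uptake".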

Relevance: 100.00%

Abstract:

The financial crisis of 2007-2009 and the subsequent reaction of the G20 have created a new global regulatory landscape, and within the EU, change to regulatory institutions is ongoing. The research objective of this study is to understand how institutional changes to the EU regulatory landscape may affect corresponding institutionalized operational practices within financial organizations, and to understand the role of agency within this process. Our motivation is to provide insight into these changes from an operational management perspective, as well as to test Thelen and Mahoney's (2010) modes of institutional change. Consequently, the study examined implementations of an Investment Management System with a rules-based compliance module within financial organizations, consulting compliance and risk managers as well as systems experts. The study suggests that prescriptive regulations are likely to create isomorphic configurations of rules-based compliance systems, which consequently will enable the institutionalization of associated compliance practices. The study also reveals the ability of some agents within financial organizations to control the impact of regulatory institutions, not directly, but through the systems and processes they adopt to meet requirements. Furthermore, the research highlights the boundaries and relationships between the modes of change as avenues for future research.

Relevance: 100.00%

Abstract:

Automatic generation of classification rules has become an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most rule-generation methods is overfitting of the training data; with Big Data, this may result in the generation of a large number of complex rules, which not only increases computational cost but may also lower accuracy in predicting further unseen instances. This has led to the need for pruning methods that simplify rules. In addition, once generated, classification rules are used to make predictions, and for efficiency the first rule that fires should be found as quickly as possible when searching through a rule set; a suitable structure is therefore required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors review some existing methods and techniques used for each of the three operations and highlight their limitations, and introduce some novel methods and techniques they have recently developed. These are discussed in comparison with existing ones with respect to the efficient processing of Big Data.
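To show why rule representation matters for finding the first firing rule, the sketch below contrasts a plain decision-list scan with a representation indexed by each rule's leading condition, so whole groups of inapplicable rules are skipped. A toy Python illustration, not the chapter's actual techniques; the rules and data are invented.

```python
from collections import defaultdict

# A rule is (conditions, label); a condition is (attribute, value).
rules = [
    ([("outlook", "sunny"), ("humidity", "high")], "no"),
    ([("outlook", "sunny"), ("humidity", "normal")], "yes"),
    ([("outlook", "overcast")], "yes"),
    ([("outlook", "rain"), ("wind", "strong")], "no"),
    ([("outlook", "rain"), ("wind", "weak")], "yes"),
]

def fires(conditions, instance):
    return all(instance.get(attr) == val for attr, val in conditions)

def first_fire_linear(rules, instance):
    """Decision-list representation: scan every rule in order."""
    for conds, label in rules:
        if fires(conds, instance):
            return label
    return None  # a default rule would normally go here

# Index rules by their leading condition so whole groups of
# inapplicable rules are skipped during prediction.
index = defaultdict(list)
for conds, label in rules:
    index[conds[0]].append((conds, label))

def first_fire_indexed(index, instance):
    """Indexed representation: only groups whose leading condition
    matches the instance are searched (assumes rules are grouped by
    their leading test, as generated above)."""
    for (attr, val), group in index.items():
        if instance.get(attr) == val:
            for conds, label in group:
                if fires(conds, instance):
                    return label
    return None

x = {"outlook": "rain", "wind": "weak", "humidity": "high"}
print(first_fire_linear(rules, x), first_fire_indexed(index, x))
# yes yes
```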