942 results for systems integration


Relevance:

30.00%

Publisher:

Abstract:

Environmental issues, including global warming, have been serious challenges realized worldwide, and they have become particularly important for iron and steel manufacturers during the last decades. Many sites have been shut down in developed countries due to environmental regulation and pollution prevention, while a large number of production plants have been established in developing countries, which has changed the economy of this business. Sustainable development is a concept which today affects economic growth, environmental protection and social progress in setting up the basis for future ecosystems. A sustainable approach may attempt to preserve natural resources, recycle and reuse materials, prevent pollution, enhance yield and increase profitability. To achieve these objectives, numerous alternatives should be examined in sustainable process design. Conventional engineering work cannot address all of these alternatives effectively and efficiently to find an optimal processing route. A systematic framework is needed as a tool to guide designers in making decisions based on overall concepts of the system, identifying the key bottlenecks and opportunities that lead to an optimal design and operation of the system. Since the 1980s, researchers have made great efforts to develop tools for what today is referred to as Process Integration. Advanced mathematics has been used in simulation models to evaluate the available alternatives considering physical, economic and environmental constraints. Improvements in feed material and operation, a competitive energy market, environmental restrictions and the role of Nordic steelworks as energy suppliers (electricity and district heat) provide strong motivation for integration among industries toward more sustainable operation, which could increase overall energy efficiency and decrease environmental impacts. In this study, a model is developed in several steps for primary steelmaking, with the Finnish steel sector as a reference, to evaluate future operation concepts of a steelmaking site with regard to sustainability. The research started with a study of the potential for increasing energy efficiency and reducing carbon dioxide emissions by integrating steelworks with chemical plants to utilize the available off-gases in the system as chemical products. These off-gases from the blast furnace, basic oxygen furnace and coke oven consist mainly of carbon monoxide, carbon dioxide, hydrogen, nitrogen and partially methane (in coke oven gas); they have a relatively low heating value and are currently used as fuel within these industries. A nonlinear optimization technique is used to assess integration with a methanol plant under novel blast furnace technologies and (partial) substitution of coal with other reducing agents and fuels such as heavy oil, natural gas and biomass. The technical aspects of integration and their effect on blast furnace operation, disregarding the capital expenditure of new operational units, are studied to evaluate the feasibility of the idea behind the research. Later on, the concept of a polygeneration system was added and a superstructure was generated with alternative routes for off-gas pretreatment and further utilization in a polygeneration system producing electricity, district heat and methanol.
(Vacuum) pressure swing adsorption, membrane technology and chemical absorption for gas separation; partial oxidation, carbon dioxide reforming and steam methane reforming for methane conversion; and gas- and liquid-phase methanol synthesis are the main alternative process units considered in the superstructure. Due to the high degree of integration in process synthesis and the optimization techniques involved, equation-oriented modeling is chosen over the previously used sequential modeling strategy for process analysis of the suggested superstructure. A mixed-integer nonlinear programming (MINLP) model is developed to study the behavior of the integrated system under different economic and environmental scenarios. Net present value and specific carbon dioxide emission are taken to compare the economic and environmental aspects of the integrated system, respectively, for different fuel systems, alternative blast furnace reductants, implementation of new blast furnace technologies, and carbon dioxide emission penalties. Sensitivity analysis, carbon distribution and the effect of external seasonal energy demand are investigated with different optimization techniques. This tool can provide useful information concerning the techno-environmental and economic aspects for decision-making and estimate optimal operational conditions of current and future primary steelmaking under alternative scenarios. The results of the work have demonstrated that it is possible to develop steelmaking towards more sustainable operation in the future.
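As a rough illustration of the kind of model the abstract describes, the sketch below sets up a toy MINLP in Pyomo: a binary decision to build a methanol unit, a continuous off-gas flow, a nonlinear net-present-value objective and a CO2 constraint. All names and coefficients are hypothetical and chosen only to show the MINLP structure; they are not taken from the thesis.

# Minimal MINLP sketch (illustrative only; names and coefficients are
# hypothetical, not from the thesis): choose whether to build a methanol
# unit (binary y) and how much off-gas to route to it (x), maximising net
# present value subject to a specific CO2 emission cap.
from pyomo.environ import (ConcreteModel, Var, Binary, NonNegativeReals,
                           Objective, Constraint, maximize, SolverFactory)

m = ConcreteModel()
m.x = Var(domain=NonNegativeReals, bounds=(0, 100))  # off-gas to methanol
m.y = Var(domain=Binary)                             # build methanol unit?

# NPV: revenue with diminishing returns (nonlinear) minus a fixed cost
m.npv = Objective(expr=3.0 * m.x - 0.01 * m.x**2 - 150 * m.y, sense=maximize)

# Off-gas can only be routed if the unit is built (big-M linking)
m.link = Constraint(expr=m.x <= 100 * m.y)

# Specific CO2 emission must stay below a hypothetical penalty threshold
m.co2 = Constraint(expr=80 - 0.5 * m.x <= 60)

# Requires an MINLP-capable solver such as Bonmin or Couenne, if installed:
# SolverFactory('bonmin').solve(m)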

Relevance:

30.00%

Publisher:

Abstract:

Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information and communication technology (ICT)-enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information flows and offering better information visibility to business ecosystem actors. The product, component and raw material flows in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the processes parallel to the service and information flows as information logistics integration. In this thesis, we show how better integration and automation of information flows enhance the speed of processes and thus provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into the DBE. Organizations, business networks and collaborations, even with competitors, form DBE in which information logistics integration has a significant role as a value driver. However, traditional economic and computing theories do not treat digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks for exploring digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not current practice in a company's strategic process. In this thesis, we have developed and tested a framework for exploring digital business ecosystems and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and mixed methods, in which we use the Delphi method and Internet-based tools for idea generation and development. We conducted many interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on a Monte Carlo simulation, which sought cost savings, and on Real Option Valuation, which sought an optimal investment program at the ecosystem level. This study provides valuable knowledge regarding information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in understanding information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we developed the design of the core information model for B2B integration.
We built this quantitative analysis using the Monte Carlo-based simulation model and the Real Option Valuation model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, where the current literature needs to be improved. The research was carried out with high-level experts and managers responsible for global business network B2B integration. However, it was dominated by one industry domain, and therefore a more comprehensive exploration should be undertaken to cover a larger population of business sectors. Building on this research, a new quantitative survey could provide new possibilities to examine information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to the collaboration issues of integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration to support a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formulation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
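To make the quantitative side concrete, here is a minimal sketch of a Monte Carlo cost-savings estimate of the kind the abstract mentions. The distributions and parameters are invented for illustration; the thesis's actual simulation model is not reproduced here.

# Minimal Monte Carlo sketch of B2B-integration cost savings; all
# distributions and figures below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(seed=7)
N = 100_000  # number of simulated scenarios

# Uncertain inputs: transactions automated per year, saving per
# transaction (EUR), and recurring integration cost (EUR).
transactions = rng.normal(50_000, 10_000, N).clip(min=0)
saving_per_tx = rng.triangular(0.5, 1.2, 2.0, N)
integration_cost = rng.normal(40_000, 8_000, N)

annual_savings = transactions * saving_per_tx - integration_cost
print(f"mean annual net saving: {annual_savings.mean():,.0f} EUR")
print(f"P(net saving < 0):      {(annual_savings < 0).mean():.2%}")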

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, computer-based systems tend to become more complex and control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment. Therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on this system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for integrating formal verification results into safety cases. This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
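For readers unfamiliar with the style of modelling the abstract refers to, the plain-Python sketch below mimics the core Event-B idea of guarded events preserving a machine invariant. It is only an analogy built around a hypothetical tank controller, not Event-B notation; where Rodin would discharge proof obligations statically, the invariant here is merely asserted at runtime.

# Illustrative analogy only (plain Python, not Event-B/Rodin): guarded
# events over abstract state, with a safety invariant 0 <= level <= MAX.
MAX_LEVEL = 100

class TankMachine:
    def __init__(self):
        self.level = 0  # abstract state variable

    def invariant(self):
        # Safety requirement expressed as a state invariant
        return 0 <= self.level <= MAX_LEVEL

    def fill(self, amount):
        # Guard: the event may fire only when it cannot break the invariant
        if self.level + amount <= MAX_LEVEL:
            self.level += amount

    def drain(self, amount):
        if self.level - amount >= 0:
            self.level -= amount

machine = TankMachine()
for event, arg in [("fill", 60), ("fill", 70), ("drain", 30)]:
    getattr(machine, event)(arg)
    assert machine.invariant()  # the proof obligation, checked at runtime here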

Relevance:

30.00%

Publisher:

Abstract:

Due to various advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of a chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications. With their computational power, these platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the utilization of the resources has to be efficient in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but becomes an issue also at ground level, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach, which allows us to ensure correct behaviour of the system with respect to the postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach where the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models in a hardware description language, namely VHDL.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, technological advancements in microelectronics and sensor technologies have revolutionized the field of electrical engineering. New manufacturing techniques have enabled a higher level of integration that has combined sensors and electronics into compact and inexpensive systems. Previously, the challenge in measurements was to understand the operation of the electronics and sensors, but this has now changed. Nowadays, the challenge in measurement instrumentation lies in mastering the whole system, not just the electronics. To address this issue, this doctoral dissertation studies whether it would be beneficial to consider a measurement system as a whole from the physical phenomena to the digital recording device, where each piece of the measurement system affects the system performance, rather than as a system consisting of small independent parts such as a sensor or an amplifier that could be designed separately. The objective of this doctoral dissertation is to describe in depth the development of the measurement system taking into account the challenges caused by the electrical and mechanical requirements and the measurement environment. The work is done as an empirical case study in two example applications that are both intended for scientific studies. The cases are a light sensitive biological sensor used in imaging and a gas electron multiplier detector for particle physics. The study showed that in these two cases there were a number of different parts of the measurement system that interacted with each other. Without considering these interactions, the reliability of the measurement may be compromised, which may lead to wrong conclusions about the measurement. For this reason it is beneficial to conceptualize the measurement system as a whole from the physical phenomena to the digital recording device where each piece of the measurement system affects the system performance. The results work as examples of how a measurement system can be successfully constructed to support a study of sensors and electronics.

Relevance:

30.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support: the Rodin platform. Proof-based verification as well as the reliance on abstraction and decomposition adopted in Event-B provide the designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with the integration of such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from such areas as robotics, space, healthcare and the cloud domain.
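As a taste of the quantitative side, the sketch below uses SimPy (which the abstract names) to estimate the availability of a service subject to random faults and a fixed reconfiguration delay. The fault and repair parameters are hypothetical, and the model is far simpler than the case studies described.

# Minimal SimPy sketch (illustrative parameters): availability of a service
# under random faults with a fixed reconfiguration (repair) delay.
import random
import simpy

MTBF, REPAIR_TIME, HORIZON = 100.0, 5.0, 10_000.0
downtime = 0.0

def lifecycle(env):
    global downtime
    while True:
        yield env.timeout(random.expovariate(1.0 / MTBF))  # time to next fault
        start = env.now
        yield env.timeout(REPAIR_TIME)                     # reconfiguration
        downtime += env.now - start

random.seed(42)
env = simpy.Environment()
env.process(lifecycle(env))
env.run(until=HORIZON)
print(f"estimated availability: {1 - downtime / HORIZON:.4f}")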

Relevance:

30.00%

Publisher:

Abstract:

The research presented is a qualitative case study of educators' experiences in integrating living skills in the context of health and physical education (HPE). Using semi-structured interviews, the study investigated HPE educators' experiences and revealed their insights relative to three major themes: professional practice, challenges and support systems. Professional practice experiences detailed the use of progressive lesson planning, reflective and engaging activities, explicitly student-centered pedagogy and holistic teaching philosophies. Furthermore, limited knowledge and awareness of living skills, conflicting teaching philosophies, competitive environments between subject areas, and lack of time and accessibility were the four major challenges that emerged from the data. Major supportive roles for HPE educators in the integration process included other educators, consultants, school administration, public health, parents, community programs and professional organizations. The study provides valuable discussion and suggestions for improving pedagogical practices in teaching living skills in the HPE setting.

Relevance:

30.00%

Publisher:

Abstract:

A packed bed bioreactor (PBBR) was developed for rapid establishment of nitrification in brackish water hatchery systems in the tropics. The reactors were activated by immobilizing ammonia-oxidizing (AMONPCU-1) and nitrite-oxidizing (NIONPCU-1) bacterial consortia on polystyrene and low-density polyethylene beads, respectively. Fluorescence in situ hybridization demonstrated the presence in the consortia of autotrophic nitrifiers belonging to Nitrosococcus mobilis, a lineage of β ammonia oxidizers, and the nitrite oxidizer Nitrobacter sp. Upon integration into the hatchery system, the activated reactors produced significant ammonia removal (P < 0.01), culminating in undetectable levels. Consequently, a significantly higher percent survival of larvae was observed in the larval production systems. With spent water, the reactors could establish nitrification with high percentage removal of ammonia (78%), nitrite (79%) and BOD (56%) within 7 days of initiation of the process. The PBBR is configured to minimize the energy requirements for continuous operation by limiting the energy inputs to single-stage pumping of water and aeration of the aeration cells. The PBBR will enable hatchery systems to operate in closed recirculating mode and pave the way for better water management in the aquaculture industry.

Relevance:

30.00%

Publisher:

Abstract:

Lead-free magnetoelectrics with a strong sub-resonant (broad frequency range) magnetoelectric coupling coefficient (MECC) are a pressing goal that could revolutionise the microelectronics and microelectromechanical systems (MEMS) industry. We report a giant resonant MECC in lead-free nanograined barium titanate–CoFe (alloy)–barium titanate [BTO-CoFe-BTO] sandwich thin films. The resonant MECC values obtained here are the highest recorded in thin films/multilayers. The sub-resonant MECC values are comparable to the highest MECC reported in 2-2 layered structures. The MECC was enhanced by two orders of magnitude at a low-frequency resonance. The results show the potential of these thin films for transducers, magnetic-field-assisted energy harvesters, switching devices, and storage applications. Some possible device integration techniques are also discussed.

Relevance:

30.00%

Publisher:

Abstract:

Biometrics is an efficient technology with great potential in the area of security system development for official and commercial applications. Biometrics has recently become a significant part of any efficient person authentication solution. The advantage of using biometric traits is that they cannot be stolen, shared or even forgotten. The thesis addresses one of the emerging topics in authentication systems, viz., the implementation of an Improved Biometric Authentication System using Multimodal Cue Integration, as operator-assisted identification turns out to be tedious, laborious and time-consuming. In order to derive the best performance from the authentication system, appropriate feature selection criteria have been evolved. It has been seen that selecting too many features leads to deterioration in authentication performance and efficiency. In the work reported in this thesis, judiciously chosen components of the biometric traits and their feature vectors are used to realize the newly proposed Biometric Authentication System using Multimodal Cue Integration. The feature vectors generated from the noisy biometric traits are compared with the feature vectors available in the knowledge base, and the most closely matching pattern is identified for the purpose of user authentication. In an attempt to improve the success rate of the feature-vector-based authentication system, the proposed system has been augmented with a user-dependent weighted fusion technique.
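The user-dependent weighted fusion the abstract ends on can be sketched as follows. The normalization bounds, weights and acceptance threshold below are hypothetical placeholders, since the thesis's exact scheme is not given in the abstract.

# Hypothetical sketch of user-dependent weighted score fusion: per-user
# weights combine normalized match scores from several modalities.
import numpy as np

def min_max_normalize(scores, lo, hi):
    """Map raw matcher scores onto [0, 1] using known score bounds."""
    return (np.asarray(scores, dtype=float) - lo) / (hi - lo)

def fuse(modal_scores, user_weights):
    """Weighted sum of normalized per-modality scores for one user."""
    w = np.asarray(user_weights, dtype=float)
    return float(np.dot(w / w.sum(), modal_scores))

# Example: face and voice scores for a claimed identity whose voice
# matcher is (hypothetically) known to be more reliable for this user.
face = min_max_normalize([72.0], lo=0.0, hi=100.0)[0]
voice = min_max_normalize([0.81], lo=0.0, hi=1.0)[0]
score = fuse([face, voice], user_weights=[0.4, 0.6])
print("accept" if score >= 0.7 else "reject", round(score, 3))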

Relevance:

30.00%

Publisher:

Abstract:

Parasitic weeds of the genera Striga, Orobanche, and Phelipanche pose a severe problem for agriculture because they are difficult to control and are highly destructive to several crops. The present work was carried out from October 2009 to February 2012 to evaluate the potential of arbuscular mycorrhizal fungi (AMF) to suppress P. ramosa on tomatoes and to investigate the effects of air-dried powder and aqueous extracts from Euphorbia hirta on germination and haustorium initiation in Phelipanche ramosa. The work was divided into three parts: first, a survey of the indigenous mycorrhizal flora in Sudan; second, laboratory and greenhouse experiments (conducted in Germany and Sudan) to build a basis for the third part, a field trial in Sudan. The survey was performed in 2009 in the White Nile state, Sudan, to assess AMF spore densities and root colonization in nine fields planted with 13 different important agricultural crops. In addition, an attempt was made to study the relationship between soil physico-chemical properties and AMF spore density, colonization rate, species richness and other diversity indices. The mean percentage of AMF colonization was 34%, ranging from 19-50%. The spore densities (expressed per 100 g dry soil) retrieved from the rhizosphere of different crops were relatively high, varying from 344 to 1222 with a mean of 798. There was no correlation between spore densities in soil and root colonization percentage. A total of 45 morphologically classifiable species representing ten genera of AMF were detected, with no correlation between the number of species found in a soil sample and the spore density. The most abundant genus was Glomus (20 species). The AMF diversity expressed by the Shannon-Weaver index was highest in sorghum (H' = 2.27) and Jew's mallow (H' = 2.13) and lowest in alfalfa (H' = 1.4). With respect to crop species, the genera Glomus and Entrophospora were encountered in almost all crops, except for Entrophospora in alfalfa. Kuklospora was found only in sugarcane and sorghum. The genus Ambispora was recovered only in mint and okra, while mint and onion were the only species on which no Acaulospora was found. The hierarchical cluster analysis based on the similarity among AMF communities with respect to crop species overall showed that species compositions were relatively similar, with the highest dissimilarity of about 25% separating three of the mango samples and the four sorghum samples from all other samples. Laboratory experiments studied the influence of root and stem exudates of three tomato varieties infected by three different Glomus species on germination of P. ramosa. Root exudates were collected 21 or 42 days after transplanting (DAT) and stem exudates 42 DAT, and tested for their effects on germination of P. ramosa seeds in vitro. The tomato varieties studied did not have an effect on either mycorrhizal colonization or Phelipanche germination. Germination in response to exudates from 42-day-old mycorrhizal plants was significantly reduced in comparison to non-mycorrhizal controls. Germination of P. ramosa in response to root exudates from 21-day-old plants was consistently higher than for 42-day-old plants (F = 121.6; P < 0.0001). Stem diffusates from non-mycorrhizal plants invariably elicited higher germination than diffusates from the corresponding mycorrhizal ones, and the differences were mostly statistically significant.
A series of laboratory experiments was undertaken to investigate the effects of aqueous extracts from Euphorbia hirta on germination, radicle elongation, and haustorium initiation in P. ramosa. P. ramosa seeds conditioned in water and subsequently treated with diluted E. hirta extract (10-25% v/v) displayed considerable germination (47-62%). Increasing the extract concentration to 50% or more reduced germination in response to the synthetic germination stimulants GR24 and Nijmegen-1 in a concentration-dependent manner. P. ramosa germlings treated with diluted Euphorbia extract (10-75% v/v) displayed haustorium initiation comparable to 2,5-dimethoxy-p-benzoquinone (DMBQ) at 20 µM. Euphorbia extract applied during conditioning reduced haustorium initiation in a concentration-dependent manner. E. hirta extract or air-dried powder, applied to soil, induced considerable P. ramosa germination. Pot experiments were undertaken in a glasshouse at the University of Kassel, Germany, to investigate the effects of the P. ramosa seed bank on tomato growth parameters. Different Phelipanche seed banks were established by mixing the parasite seeds (0-32 mg) with the potting medium in each pot. P. ramosa reduced all tomato growth parameters measured, and the reduction progressively increased with the seed bank. Root and total dry matter accumulation per tomato plant were most affected. P. ramosa emergence, number of tubercles, and tubercle dry weight increased with the seed bank and were invariably maximal with the highest seed bank. Another objective was to determine whether different AM fungi differ in their effects on the colonization of tomatoes with P. ramosa and the performance of P. ramosa after colonization. Three AMF inocula, viz. Glomus intraradices, Glomus mosseae and Glomus Sprint®, were used in this study. For the infection, P. ramosa seeds (8 mg) were mixed with the top 5 cm of soil in each pot. No mycorrhizal colonization was detected in un-inoculated control plants. P. ramosa-infested, mycorrhiza-inoculated tomato plants had significantly lower AMF colonization compared to plants not infested with P. ramosa. Inoculation with G. intraradices, G. mosseae and Glomus Sprint® reduced the number of emerged P. ramosa plants by 29.3, 45.3 and 62.7% and the number of tubercles by 22.2, 42 and 56.8%, respectively. Mycorrhizal root colonization was positively correlated with the number of branches and total dry matter of tomatoes. Field experiments on tomato undertaken in 2010/12 were only partially successful because of insect infestations, which resulted in the complete destruction of the second run of the experiment. The effects of inoculation with AMF, the addition of 10 t ha-1 filter mud (FM), an organic residue from sugar processing, and 36 or 72 kg N ha-1 on the infestation of tomatoes with P. ramosa were assessed. In un-inoculated control plants, AMF colonization ranged between 13.4 and 22.1% with no significant differences among FM and N treatments. Adding AMF or FM resulted in a significant increase of branching in the tomato plants with no additive effects. Dry weights were slightly increased by FM application when no N was applied, and significantly at 36 kg N ha-1. There was no effect of FM on the time until the first Phelipanche emerged, while AMF and N application interacted. In particular, AMF inoculation resulted in a tendency towards delayed P. ramosa emergence.
The marketable yield was extremely low due to heavy fruit infestation with insects, mainly the whitefly Bemisia tabaci and the tomato leaf miner (Tuta absoluta). Tomatoes inoculated with different mycorrhiza species displayed different responses to the insect infestation: G. intraradices significantly reduced the infestation, while G. mosseae elicited higher insect infestation. The results of the present thesis indicate that there may be potential for developing management strategies for P. ramosa targeting the pre-attachment stages, namely germination and haustorium initiation, using plant extracts. However, ways of practical use need to be developed. Whether such treatments can be combined with AMF inoculation also needs to be investigated. Overall, a systematic approach will be required to develop management tools that are easily applicable and affordable for Sudanese farmers. It is well known that proper agronomic practices, such as the design of an optimum crop rotation in cropping systems, reduced tillage, promotion of cover crops, the introduction of multi-microbial inoculants, and maintenance of proper phosphorus levels, are advantageous if the mycorrhiza protection method is exploited against Phelipanche ramosa infestation. However, without farmers' knowledge of the biology of the parasitic weeds and basic preventive measures such as hygiene and seed quality control, no control strategy will be successful.

Relevance:

30.00%

Publisher:

Abstract:

Integration of inputs by cortical neurons provides the basis for the complex information processing performed in the cerebral cortex. Here, we propose a new analytic framework for understanding integration within cortical neuronal receptive fields. Based on the synaptic organization of cortex, we argue that neuronal integration is a systems-level process better studied in terms of local cortical circuitry than at the level of single neurons, and we present a method for constructing self-contained modules which capture (nonlinear) local circuit interactions. In this framework, receptive field elements naturally have a dual (rather than the traditional unitary) influence, since they drive both excitatory and inhibitory cortical neurons. This vector-based analysis, in contrast to scalar approaches, greatly simplifies integration by permitting linear summation of inputs from both "classical" and "extraclassical" receptive field regions. We illustrate this by explaining two complex visual cortical phenomena, which are incompatible with scalar notions of neuronal integration.
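A minimal numerical reading of the dual-influence idea, with made-up numbers rather than anything from the paper: each input contributes a vector of (excitatory, inhibitory) drive to the local circuit, the vectors from classical and extraclassical regions sum linearly, and only the readout is nonlinear.

# Illustrative sketch only (hypothetical values): dual-influence inputs
# summed linearly as vectors before a simple nonlinear scalar readout.
import numpy as np

# Rows: inputs; columns: (excitatory, inhibitory) drive onto the circuit
classical = np.array([[1.0, 0.2],
                      [0.8, 0.1]])
extraclassical = np.array([[0.1, 0.6]])  # mostly suppressive surround

total = classical.sum(axis=0) + extraclassical.sum(axis=0)
excitation, inhibition = total
response = max(excitation - inhibition, 0.0)  # rectified readout
print(f"E={excitation:.2f}, I={inhibition:.2f}, response={response:.2f}")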

Relevance:

30.00%

Publisher:

Abstract:

Each player in the financial industry, each bank, stock exchange, government agency, or insurance company operates its own financial information system or systems. By its very nature, financial information, like the money that it represents, changes hands. Therefore the interoperation of financial information systems is the cornerstone of the financial services they support. E-services frameworks such as web services are an unprecedented opportunity for the flexible interoperation of financial systems. Naturally, the critical economic role and the complexity of financial information have led to the development of various standards. Yet standards alone are not a panacea: different groups of players use different standards or different interpretations of the same standard. We believe that the solution lies in the convergence of flexible e-services such as web services and semantically rich metadata as promised by the Semantic Web; a mediation architecture can then be used for the documentation, identification, and resolution of semantic conflicts arising from the interoperation of heterogeneous financial services. In this paper we illustrate the nature of the problem in the Electronic Bill Presentment and Payment (EBPP) industry and the viability of the solution we propose. We describe and analyze the integration of services using four different formats: the IFX, OFX and SWIFT standards, and an example proprietary format. To accomplish this integration we use the COntext INterchange (COIN) framework. The COIN architecture leverages a model of sources' and receivers' contexts in reference to a rich domain model, or ontology, for the description and resolution of semantic heterogeneity.
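To give a flavour of the context-mediation idea, here is a toy sketch: each source reports an amount in its own context (currency, scale factor), and a mediator converts values into the receiver's context before the services interoperate. This is not the actual COIN framework or any real IFX/OFX/SWIFT schema; contexts, rates and fields are invented for illustration.

# Toy context mediation sketch (hypothetical contexts and rates; not the
# COIN API): convert a reported amount between source and receiver contexts.
from dataclasses import dataclass

@dataclass
class Context:
    currency: str
    scale: float  # e.g., amounts reported in thousands -> scale = 1000

RATES_TO_USD = {"USD": 1.0, "EUR": 1.1}  # hypothetical fixed rates

def mediate(value, source: Context, receiver: Context) -> float:
    """Convert a value from the source context into the receiver context."""
    usd = value * source.scale * RATES_TO_USD[source.currency]
    return usd / (RATES_TO_USD[receiver.currency] * receiver.scale)

bill_amount = mediate(12.5,
                      Context(currency="EUR", scale=1000),  # source: kEUR
                      Context(currency="USD", scale=1))     # receiver: USD
print(round(bill_amount, 2))  # 13750.0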

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a P2P-based database sharing system that provides information sharing capabilities through keyword-based search techniques. Our system requires neither a global schema nor schema mappings between different databases, and our keyword-based search algorithms are robust in the presence of frequent changes in the content and membership of peers. To facilitate data integration, we introduce a keyword join operator to combine partial answers containing different keywords into complete answers. We also present an efficient algorithm that optimizes the keyword join operations for partial answer integration. Our experimental study on both real and synthetic datasets demonstrates the effectiveness of our algorithms and the efficiency of the proposed query processing strategies.
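A much-simplified sketch of the keyword join idea follows (the paper's operator is more elaborate): partial answers each cover some of the query keywords, joining two partial answers that share a join key yields a combined answer, and a result is complete once it covers every query keyword. The keys and keywords below are invented examples.

# Simplified keyword join sketch (illustrative data, not the paper's
# algorithm): combine partial answers until all query keywords are covered.
from itertools import product

QUERY = {"price", "supplier"}

# (join_key, covered_keywords) pairs from two peers' partial answers
left = [("item42", {"price"}), ("item77", {"price"})]
right = [("item42", {"supplier"})]

complete = []
for (k1, kw1), (k2, kw2) in product(left, right):
    if k1 == k2 and (kw1 | kw2) >= QUERY:  # joinable and covers the query
        complete.append((k1, kw1 | kw2))
print(complete)  # e.g. [('item42', {'price', 'supplier'})]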

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QOS requirements must be applied to all services; link utilization will therefore decrease because unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem can be much simplified by separating different types of traffic and assigning each a VP with dedicated resources (buffers and links). In this case, however, resources may not be efficiently utilized because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
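A small sketch of a convolution-style PC computation, under simplifying assumptions not drawn from the paper: each on-off source demands its peak bandwidth with some activity probability, convolving the sources' bandwidth distributions gives the aggregate demand distribution, and the probability of congestion is the tail mass above the link capacity. Source mixes, peaks and the capacity are hypothetical.

# Illustrative convolution-approach sketch (hypothetical traffic mix):
# P(congestion) = P(aggregate bandwidth demand > link capacity).
import numpy as np

def onoff_pmf(peak, p, grid):
    """PMF of one on-off source's demand on a unit bandwidth grid."""
    pmf = np.zeros(grid)
    pmf[0], pmf[peak] = 1 - p, p
    return pmf

GRID = 64  # bandwidth units; large enough to hold the maximum total demand
sources = [(3, 0.4)] * 10 + [(8, 0.1)] * 3  # (peak, activity) per source

agg = np.zeros(GRID)
agg[0] = 1.0
for peak, p in sources:
    agg = np.convolve(agg, onoff_pmf(peak, p, GRID))[:GRID]

capacity = 30
pc = agg[capacity + 1:].sum()  # probability of congestion
print(f"P(congestion) with capacity {capacity}: {pc:.3e}")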