19 results for hierarchical porous media
at Instituto Politécnico do Porto, Portugal
Abstract:
Polyaromatic hydrocarbon (PAH) sorption to soil is a key process determining the transport and fate of PAH, and its potential toxic impacts on soil and groundwater ecosystems, for example in connection with atmospheric PAH deposition on soils. There are numerous studies on PAH sorption in porous media of relatively low organic content, such as urban soils and groundwater sediments, but less attention has been given to cultivated soils. In this study, the phenanthrene partition coefficient, KD (liters per kilogram), was measured on 143 cultivated Danish soils (115 topsoils, 0–0.25-m depth, and 28 subsoils, 0.25–1-m depth) by the single-point adsorption method. The organic carbon partition coefficient, KOC (liters per kilogram), for topsoils was generally found to fall between the KOC values estimated by the two most frequently used models for PAH partitioning, the Abdul et al. (Hazardous Waste & Hazardous Materials 4(3):211–222, 1987) model and the Karickhoff et al. (Water Research 13:241–248, 1979) model. A less-recognized model by Karickhoff (Chemosphere 10:833–846, 1981), yielding a KOC of 14,918 L kg−1, closely corresponded to the average measured KOC value for the topsoils, and this model is therefore recommended for predicting phenanthrene mobility in cultivated topsoils. For lower subsoils (0.25–1-m depth), the KOC values were closer to, and mostly below, the estimate by the Abdul et al. (1987) model. This implies a different organic matter composition and higher PAH sorption strength in cultivated topsoils, likely due to management effects including more rapid carbon turnover. Finally, we applied the recent Dexter et al. (Geoderma 144:620–627, 2008) theorem and calculated the complexed and non-complexed organic carbon fractions (COC and NCOC, grams per gram).
Multiple regression analyses showed that the NCOC-based phenanthrene partition coefficient (KNCOC) could be markedly higher than the COC-based partition coefficient (KCOC) for soils with a clay/OC ratio <10. This possibly higher PAH sorption affinity of the NCOC fraction needs further investigation to develop more realistic and accurate models for PAH mobility and effects in the environment, also with regard to colloid-facilitated PAH transport.
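The core quantities used above can be sketched as follows; this is an illustrative outline, not the paper's exact procedure, and all numeric inputs are hypothetical:

```python
# Illustrative sketch, not the paper's exact procedure: normalising a measured
# partition coefficient K_D by the organic carbon fraction to obtain K_OC, and
# splitting total organic carbon into complexed (COC) and non-complexed (NCOC)
# fractions in the spirit of Dexter et al. (2008), where clay is assumed to
# complex at most 1 g of organic carbon per n = 10 g of clay.
# All numeric values are hypothetical, not measured data from the study.

def k_oc(k_d, f_oc):
    """K_OC (L/kg) = K_D (L/kg) divided by the organic carbon mass fraction."""
    return k_d / f_oc

def dexter_split(oc, clay, n=10.0):
    """Split total OC (g/g) into (COC, NCOC) given the clay content (g/g)."""
    coc = min(oc, clay / n)   # complexation capacity is capped at clay/n
    return coc, oc - coc

# Hypothetical topsoil: K_D = 300 L/kg, 2% organic carbon, 15% clay.
print(k_oc(300.0, 0.02))         # roughly the 14,918 L/kg model value
print(dexter_split(0.02, 0.15))  # clay/OC = 7.5 < 10, so part of OC is NCOC
```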
Abstract:
The present study aimed to develop a pre-endothelialized chitosan (CH) porous hollowed scaffold for application in spinal cord regenerative therapies. CH conduits with different degrees of acetylation (DA; 4% and 15%) were prepared, characterized (microstructure, porosity and water uptake) and functionalized with a recombinant fragment of human fibronectin (rhFNIII7–10). Immobilized rhFNIII7–10 was characterized in terms of amount (125I-radiolabelling), exposure of cell-binding domains (immunofluorescence) and ability to mediate endothelial cell (EC) adhesion and cytoskeletal rearrangement. Functionalized conduits revealed a linear increase in immobilized rhFNIII7–10 with rhFNIII7–10 concentration, and, for the same concentration, higher amounts of rhFNIII7–10 on DA 4% compared with DA 15%. Moreover, rhFNIII7–10 concentrations as low as 5 and 20 µg ml−1 in the coupling reaction were shown to provide DA 4% and 15% scaffolds, respectively, with levels of exposed cell-binding domains exceeding those observed on the control (DA 4% scaffolds incubated in a 20 µg ml−1 human fibronectin solution). These grafting conditions proved to be effective in mediating EC adhesion/cytoskeletal organization on CH with DA 4% and 15%, without affecting the endothelial angiogenic potential. rhFNIII7–10 grafting to CH could be a strategy of particular interest in tissue engineering applications requiring the use of endothelialized porous matrices with tunable degradation rates.
Abstract:
Broadcast networks that are characterised by having different physical layers (PhL) demand some kind of traffic adaptation between segments, in order to avoid traffic congestion in linking devices. In many LANs, this problem is solved by the linking devices themselves, which use some kind of flow control mechanism that either tells transmitting stations to pause (the transmission) or simply discards frames. In this paper, we address the case of token-passing fieldbus networks operating in a broadcast fashion and involving message transactions over heterogeneous (wired or wireless) physical layers. For the addressed case, real-time and reliability requirements demand a different solution to the traffic adaptation problem. Our approach relies on the insertion of an appropriate idle time before a station issues a request frame. In this way, we guarantee that the linking devices' queues do not grow to the point where the timeliness properties of the overall system become unsuitable for the targeted applications.
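The idle-time idea above can be illustrated with a back-of-envelope calculation; the store-and-forward assumption and all bit rates and frame lengths below are hypothetical, not values from the paper:

```python
# Back-of-envelope illustration of the idle-time idea: before issuing the next
# request frame, the initiator stays idle long enough for a store-and-forward
# linking device to finish relaying the previous frame on a slower downstream
# segment, so the device's queue stays bounded.
# Bit rates and frame length are hypothetical, not taken from the paper.

def frame_duration(length_bits, bit_rate_bps):
    """Transmission time of one frame on a segment."""
    return length_bits / bit_rate_bps

def required_idle_time(length_bits, fast_bps, slow_bps):
    """Extra idle time to insert when relaying from a fast to a slow segment."""
    extra = frame_duration(length_bits, slow_bps) - frame_duration(length_bits, fast_bps)
    return max(0.0, extra)

# 200-bit request relayed from a 1.5 Mbit/s wired segment to a 250 kbit/s one.
idle = required_idle_time(200, 1_500_000, 250_000)
print(f"insert about {idle * 1e6:.0f} us of idle time before the next request")
```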
Abstract:
Although power-line communication (PLC) is not a new technology, its use to support data communication with timing requirements is still the focus of ongoing research. A new infrastructure intended for communication using power lines from a central location to dispersed nodes using inexpensive devices was presented recently. This new infrastructure uses a two-level hierarchical power-line system, together with an IP-based network. Due to the master-slave behaviour of the PLC medium access, together with the inherent dynamic topology of power-line networks, a mechanism to provide end-to-end communication through the two levels of the power-line system must be provided. In this paper we introduce the architecture of the PLC protocol layer that is being implemented to this end.
Abstract:
This paper describes how MPEG-4 object-based video (OBV) can be used to allow selected objects to be inserted into the play-out stream to a specific user based on a profile derived for that user. The application scenario described here is personalized product placement, and the paper considers the value of this application in the current and evolving commercial media distribution market, given the huge emphasis media distributors are currently placing on targeted advertising. This level of application of video content requires a sophisticated content description and metadata system (e.g., MPEG-7). The scenario considers the requirement for global libraries to provide the objects to be inserted into the streams. The paper then considers the commercial trading of objects between the libraries, video service providers, advertising agencies and other parties involved in the service. Consequently, a brokerage of video objects is proposed based on negotiation and trading using intelligent agents representing the various parties. The proposed Media Brokerage Platform is a multi-agent system structured in two layers. In the top layer, there is a collection of coarse-grain agents representing the real-world players – the providers and deliverers of media content and the market regulator profiler – and, in the bottom layer, there is a set of finer-grain agents constituting the marketplace – the delegate agents and the market agent. For knowledge representation (domain, strategic and negotiation protocols) we propose a Semantic Web approach based on ontologies. The media component contents should be represented in MPEG-7 and the metadata describing the objects to be traded should follow a specific ontology. The top-layer content providers and deliverers are modelled by intelligent autonomous agents that express their will to transact – buy or sell – media components by registering at a service registry.
The market regulator profiler creates, according to the selected profile, a market agent, which, in turn, checks the service registry for potential trading partners for a given component and invites them for the marketplace. The subsequent negotiation and actual transaction is performed by delegate agents in accordance with their profiles and the predefined rules of the market.
Abstract:
This paper proposes a novel business model to support media content personalisation: an agent-based business-to-business (B2B) brokerage platform for media content producer and distributor businesses. Distributors aim to provide viewers with a personalised content experience and producers wish to ensure that their media objects are watched by as many targeted viewers as possible. In this scenario viewers and media objects (main programmes and candidate objects for insertion) have profiles and, in the case of main programme objects, are annotated with placeholders representing personalisation opportunities, i.e., locations for insertion of personalised media objects. The MultiMedia Brokerage (MMB) platform is a multiagent multilayered brokerage composed of agents that act as sellers and buyers of viewer stream timeslots and/or media objects on behalf of the registered businesses. These agents engage in negotiations to select the media objects that best match the current programme and viewer profiles.
Abstract:
Composition is a practice of key importance in software engineering. When real-time applications are composed, it is necessary that their timing properties (such as meeting deadlines) are guaranteed. The composition is performed by establishing an interface between the application and the physical platform. Such an interface typically contains information about the amount of computing capacity needed by the application. In multiprocessor platforms, the interface should also present information about the degree of parallelism. Recently there have been quite a few interface proposals. However, they are either too complex to be handled or too pessimistic. In this paper we propose the Generalized Multiprocessor Periodic Resource model (GMPR), which is strictly superior to the MPR model without requiring an overly detailed description. We describe a method to generate the interface from the application specification. All these methods have been implemented in Matlab routines that are publicly available.
Abstract:
Consider a single processor and a software system. The software system comprises components and interfaces, where each component has an associated interface and each component comprises a set of constrained-deadline sporadic tasks. A scheduling algorithm (called the global scheduler) determines at each instant which component is active. The active component uses another scheduling algorithm (called the local scheduler) to determine which task is selected for execution on the processor. The interface of a component makes certain information about the component visible to other components; the interfaces of all components are used for schedulability analysis. We address the problem of generating an interface for a component based on the tasks inside the component. We desire to (i) incur only a small loss in schedulability analysis due to the interface and (ii) ensure that the amount of space (counted in bits) of the interface is small; this is because such an interface hides as many details of the component as possible. We present an algorithm for generating such an interface.
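As a rough illustration of the size-versus-accuracy trade-off described above (this is not the paper's interface-generation algorithm, and the task set is hypothetical), one could quantise a utilisation-style interface to a fixed number of bits:

```python
from math import ceil

# Illustrative only -- NOT the interface-generation algorithm of the paper.
# It shows the trade-off the abstract describes: an interface stored in fewer
# bits is coarser and therefore more pessimistic about the component's demand.

def make_interface(tasks, period, budget_bits=8):
    """Return a (period, budget) interface for a component.

    tasks: list of (C, D, T) constrained-deadline sporadic tasks (D <= T).
    The component utilisation is rounded *up* to a grid of 2**budget_bits
    steps, so quantisation never understates the demand.
    """
    util = sum(c / t for c, _, t in tasks)
    step = 1.0 / (2 ** budget_bits)
    return period, ceil(util / step) * step * period

tasks = [(1, 4, 5), (2, 8, 10)]          # hypothetical task set, U = 0.4
print(make_interface(tasks, period=10))  # fine grid: budget close to U * period
print(make_interface(tasks, period=10, budget_bits=4))  # coarse grid: larger budget
```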
Abstract:
Wireless Sensor Networks (WSNs) are highly distributed systems in which resource allocation (bandwidth, memory) must be performed efficiently to provide a minimum acceptable Quality of Service (QoS) to the regions where critical events occur. In fact, if resources are statically assigned independently of the location and instant of the events, these resources will definitely be misused. In other words, it is more efficient to dynamically grant more resources to sensor nodes affected by critical events, thus providing better network resource management and reducing end-to-end delays of event notification and tracking. In this paper, we discuss the use of a WSN management architecture based on the active network management paradigm to provide real-time tracking and reporting of dynamic events while ensuring efficient resource utilization. The active network management paradigm allows packets to transport not only data, but also program scripts that will be executed in the nodes to dynamically modify the operation of the network. This presumes the use of a runtime execution environment (middleware) in each node to interpret the script. We consider hierarchical (e.g. cluster-tree, two-tiered architecture) WSN topologies since they have been used to improve the timing performance of WSNs as they support deterministic medium access control protocols.
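A minimal sketch of the script-interpreting middleware idea follows; the command set, node fields and numeric values are hypothetical illustrations, not the architecture implemented in the paper:

```python
# Minimal sketch of the active-network idea described above: a management
# packet carries, besides data, a small script that each node's middleware
# interprets to retune its own operation (e.g. report faster where a critical
# event occurred). The command set and node fields are hypothetical.

class SensorNode:
    def __init__(self):
        self.report_period_s = 5.0   # default, relaxed reporting
        self.priority = 0

    def execute(self, script):
        """Interpret a list of (command, field, value) tuples from a packet."""
        for cmd, field, value in script:
            if cmd == "set" and hasattr(self, field):
                setattr(self, field, value)
            elif cmd == "scale" and hasattr(self, field):
                setattr(self, field, getattr(self, field) * value)

node = SensorNode()
# Packet script: event detected nearby -> report 10x faster, raise priority.
node.execute([("scale", "report_period_s", 0.1), ("set", "priority", 7)])
print(node.report_period_s, node.priority)
```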
Abstract:
A significant number of process control and factory automation systems use PROFIBUS as the underlying fieldbus communication network. Properly setting up a PROFIBUS network is not a straightforward task. In fact, a number of network parameters must be set to guarantee the required levels of timeliness and dependability. Engineering PROFIBUS networks is even more subtle when the network includes various physical segments exhibiting heterogeneous specifications, such as bus speed or frame formats, just to mention a few. In this paper we provide the underlying theory and a methodology to guarantee the proper operation of such heterogeneous PROFIBUS networks. We additionally show how the methodology can be applied to the practical case of PROFIBUS networks simultaneously containing DP (Decentralised Periphery) and PA (Process Automation) segments, two of the most used commercial-off-the-shelf (COTS) PROFIBUS solutions. The importance of the findings is, however, not limited to this case. The proposed methodology can be generalised to cover other heterogeneous infrastructures. Hybrid wired/wireless solutions are just one example for which enormous interest exists.
Abstract:
Media content personalisation is a major challenge involving viewers as well as media content producer and distributor businesses. The goal is to provide viewers with media items aligned with their interests. Producers and distributors engage in item negotiations to establish the corresponding service level agreements (SLA). In order to address automated partner lookup and item SLA negotiation, this paper proposes the MultiMedia Brokerage (MMB) platform, which is a multiagent system that negotiates SLA regarding media items on behalf of media content producer and distributor businesses. The MMB platform is structured in four service layers: interface, agreement management, business modelling and market. In this context, there are: (i) brokerage SLA (bSLA), which are established between individual businesses and the platform regarding the provision of brokerage services; and (ii) item SLA (iSLA), which are established between producer and distributor businesses about the provision of media items. In particular, this paper describes the negotiation, establishment and enforcement of bSLA and iSLA, which occurs at the agreement and negotiation layers, respectively. The platform adopts a pay-per-use business model where the bSLA define the general conditions that apply to the related iSLA. To illustrate this process, we present a case study describing the negotiation of a bSLA instance and several related iSLA instances. The latter correspond to the negotiation of the Electronic Program Guide (EPG) for a specific end viewer.
Abstract:
Near real time media content personalisation is nowadays a major challenge involving media content sources, distributors and viewers. This paper describes an approach to seamless recommendation, negotiation and transaction of personalised media content. It adopts an integrated view of the problem by proposing, on the business-to-business (B2B) side, a brokerage platform to negotiate the media items on behalf of the media content distributors and sources, providing viewers, on the business-to-consumer (B2C) side, with a personalised electronic programme guide (EPG) containing the set of recommended items after negotiation. In this setup, when a viewer connects, the distributor looks up and invites sources to negotiate the contents of the viewer personal EPG. The proposed multi-agent brokerage platform is structured in four layers, modelling the registration, service agreement, partner lookup, invitation as well as item recommendation, negotiation and transaction stages of the B2B processes. The recommendation service is a rule-based switch hybrid filter, including six collaborative and two content-based filters. The rule-based system selects, at runtime, the filter(s) to apply as well as the final set of recommendations to present. The filter selection is based on the data available, ranging from the history of items watched to the ratings and/or tags assigned to the items by the viewer. Additionally, this module implements (i) a novel item stereotype to represent newly arrived items, (ii) a standard user stereotype for new users, (iii) a novel passive user tag cloud stereotype for socially passive users, and (iv) a new content-based filter named the collinearity and proximity similarity (CPS). At the end of the paper, we present off-line results and a case study describing how the recommendation service works. The proposed system provides, to our knowledge, an excellent holistic solution to the problem of recommending multimedia contents.
Abstract:
Simpósio de Informática (INForum 2015), Covilhã, Portugal. Notes: Best paper award nominee.
Abstract:
The nomination of Guimarães to host the 2012 European Capital of Culture (ECC) put on the city's agenda the need to measure the effects that this mega event could have on the city and on the municipality as a whole. A balance of benefits and costs, together with extended community involvement, tends to reduce negative impacts and enhance positive ones. This chapter analyzes the involvement of the population and local associations in the planning and organization of the 2012 Guimarães European Capital of Culture, using the coverage of the mega event by the local and national press during 2011. A content analysis was conducted of the news published between January and December 2011 in three newspapers: two local weekly newspapers and one national daily. From the results, it can be concluded that the involvement of the community, and also of the cultural associations, in the organization of the 2012 ECC was poor. A strong negative reaction was found to the model chosen by the official organizers to plan the mega event, which casts doubt on the desired participation of residents and, consequently, on the success of the mega event, especially from the perspective of its medium- and long-term effects.