945 results for decentralized attribute-based encryption
Abstract:
The main objectives of this paper are: firstly, to identify key issues related to sustainable intelligent buildings (environmental, social, economic and technological factors) and to develop a conceptual model for the selection of appropriate KPIs; secondly, to critically test stakeholders' perceptions and values of selected KPIs for intelligent buildings; and thirdly, to develop a new model for measuring the level of sustainability of sustainable intelligent buildings. This paper uses a consensus-based model (Sustainable Built Environment Tool, SuBETool), which is analysed using the analytic hierarchy process (AHP) for multi-criteria decision-making. The use of the multi-attribute model for priority setting in the sustainability assessment of intelligent buildings is introduced. The paper commences by reviewing the literature on sustainable intelligent buildings research and presents a pilot study investigating the problems of complexity and subjectivity. This study is based upon a survey of the perceptions held by selected stakeholders and the value they attribute to selected KPIs. It is argued that the benefit of the newly proposed model (SuBETool) is as a ‘tool’ for ‘comparative’ rather than absolute measurement. It has the potential to provide useful lessons from current sustainability assessment methods for the strategic future of sustainable intelligent buildings, in order to improve a building's performance and to deliver objective outcomes. The findings of this survey enrich the field of intelligent buildings in two ways. Firstly, they give a detailed insight into the selection of sustainable building indicators, as well as their degree of importance. Secondly, they critically test stakeholders' perceptions and values of selected KPIs for intelligent buildings. It is concluded that the priority levels for the selected criteria are largely dependent on the integrated design team, which includes the client, architects, engineers and facilities managers.
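As a note on mechanics: the AHP step described above reduces, in its simplest form, to deriving priority weights from a pairwise-comparison matrix. A minimal sketch using the standard row geometric-mean approximation; the three-criteria matrix is hypothetical, not taken from the study:

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority weights via the row geometric-mean method."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical comparison of three criteria (e.g. environmental, social,
# economic): matrix[i][j] = how strongly criterion i is preferred over j.
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_priorities(pairwise)  # weights sum to 1; first criterion dominates
```

The full SuBETool pipeline involves stakeholder-elicited judgements, criteria hierarchies and consistency checks; this sketch shows only the priority-setting core.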
Abstract:
Assessment of the risk to human health posed by contaminated land may be seriously overestimated if reliant on total pollutant concentration. In vitro extraction tests, such as the physiologically based extraction test (PBET), imitate the physicochemical conditions of the human gastro-intestinal tract and offer a more practicable alternative for routine testing purposes. However, even though passage through the colon accounts for approximately 80% of the transit time through the human digestive tract, and the typical contents of the colon in vivo are a carbohydrate-rich aqueous medium with the potential to promote desorption of organic pollutants, PBET comprises stomach and small intestine compartments only. Through the addition of an eight-hour colon compartment to PBET and the use of a carbohydrate-rich fed-state medium, we demonstrated that colon-extended PBET (CE-PBET) increased assessments of soil-bound PAH bioaccessibility by up to 50% in laboratory soils and by a factor of 4 in field soils. We attribute this increased bioaccessibility to a combination of the additional extraction time and the presence of carbohydrates in the colon compartment, both of which favor PAH desorption from soil. We propose that future assessments of the bioaccessibility of organic pollutants in soils using physiologically based extraction tests should include a colon compartment, as in CE-PBET.
Abstract:
Income growth in highly industrialised countries has resulted in consumer choice of foodstuffs no longer being primarily influenced by basic factors such as price and organoleptic features. From this perspective, the present study sets out to evaluate how and to what extent consumer choice is influenced by the possible negative effects on health and environment caused by the consumption of fruit containing deposits of pesticides and chemical products. The study describes the results of a survey which explores and estimates consumer willingness to pay in two forms: a yearly contribution for the abolition of the use of pesticides on fruit, and a premium price for organically grown apples guaranteed by a certified label. The same questionnaire was administered to two samples. The first was a conventional face-to-face survey of customers of large retail outlets located around Bologna (Italy); the second was an Internet sample. The discrete choice data were analysed by means of probit and tobit models to estimate the utility consumers attribute to organically grown fruit and to a pesticide ban. The research also addresses questions of validity and representativeness as a fundamental problem in web-based surveys.
Abstract:
Currently, multi-attribute auctions are becoming widespread awarding mechanisms for construction contracts; in these auctions, criteria other than price are taken into account when ranking bidder proposals. Therefore, being the lowest-price bidder is no longer a guarantee of being awarded the contract, which increases the importance of measuring a bidder’s performance when not only the first position (lowest price) matters. Modeling position performance allows a tender manager to calculate the probability curves for the positions most likely to be occupied by any bidder who enters a competitive auction, irrespective of the actual number of future participating bidders. This paper details a practical methodology, based on simple statistical calculations, for modeling the performance of a single bidder or a group of bidders, constituting a useful resource for analyzing one’s own success while benchmarking potential bidding competitors.
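One simple statistical reading of position performance is a binomial model of the final ranking. This is a sketch under an assumption the paper does not necessarily make, namely that each competitor is beaten independently with a fixed probability:

```python
import math

def position_probabilities(p_beat, n_competitors):
    """P(finishing in each position) when every one of n competitors is
    independently beaten with probability p_beat (binomial model)."""
    probs = []
    for losses in range(n_competitors + 1):  # competitors ranked above us
        probs.append(math.comb(n_competitors, losses)
                     * (1 - p_beat) ** losses
                     * p_beat ** (n_competitors - losses))
    return probs  # probs[k] = P(position k + 1)

# Hypothetical bidder who beats a random rival 60% of the time, facing 4 rivals.
probs = position_probabilities(0.6, 4)  # probs[0] = P(lowest price) = 0.6**4
```

In practice p_beat would be estimated from historical bid data, and the paper's methodology additionally handles an unknown number of future participants.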
Abstract:
Texture is an important visual attribute used to describe the pixel organization in an image. Although it is easily identified by humans, its analysis demands a high level of sophistication and computational complexity. This paper presents a novel approach for texture analysis based on analyzing the complexity of the surface generated from a texture, in order to describe and characterize it. The proposed method produces a texture signature which is able to efficiently characterize different texture classes. The paper also illustrates the method's performance in an experiment using texture images of leaves. Leaf identification is a difficult and complex task due to the nature of plants, which present huge pattern variation. The high classification rate yielded shows the potential of the method, improving on traditional texture techniques such as Gabor filters and Fourier analysis.
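A common way to turn surface complexity into a signature is a fractal-dimension estimate. The sketch below uses plain 3-D box counting over the (x, y, intensity) point cloud of a hypothetical patch; the paper's actual signature method may differ from this generic form:

```python
import math

def box_count_dimension(image, box_sizes):
    """Estimate the fractal dimension of a grayscale 'surface' by 3-D box
    counting: each pixel (x, y, intensity) is a point on the surface."""
    h, w = len(image), len(image[0])
    log_s, log_n = [], []
    for s in box_sizes:
        boxes = set()
        for y in range(h):
            for x in range(w):
                boxes.add((x // s, y // s, image[y][x] // s))
        log_s.append(math.log(1.0 / s))
        log_n.append(math.log(len(boxes)))
    # least-squares slope of log N(s) against log(1/s)
    n = len(box_sizes)
    mx, my = sum(log_s) / n, sum(log_n) / n
    return (sum((a - mx) * (b - my) for a, b in zip(log_s, log_n))
            / sum((a - mx) ** 2 for a in log_s))

# Hypothetical smooth-ish 8x8 intensity patch (illustrative, not a real leaf).
patch = [[x + y + (x * y) % 3 for x in range(8)] for y in range(8)]
dim = box_count_dimension(patch, [1, 2, 4])
```

A texture signature is then typically a vector of such estimates over several scales or neighbourhood radii, one per texture class to be discriminated.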
Abstract:
We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions while IP-connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise in the hydrograph – an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. We also expand our analysis by discussing the role of this approach for the efficient real-time measurement of stormwater systems.
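The forecast-driven sampling policy can be sketched as a simple rule: shorten the interval when rain is likely or the hydrograph is moving, subject to a floor imposed by energy and reagent constraints. The thresholds and units below are illustrative assumptions, not the deployed node's configuration:

```python
def next_sampling_interval(base_interval_s, rain_prob, level_rate):
    """Shorten the sampling interval when rain is forecast or the water
    level is changing quickly; keep the slow default otherwise."""
    interval = base_interval_s
    if rain_prob > 0.5:          # forecast says a storm is likely
        interval /= 4
    if abs(level_rate) > 0.01:   # hydrograph rising/falling fast (m/s, assumed)
        interval /= 2
    return max(interval, 60)     # floor: never faster than once a minute

# Quiet day: keep the hourly default. Storm with a fast-rising stream: sample often.
quiet = next_sampling_interval(3600, rain_prob=0.1, level_rate=0.0)
storm = next_sampling_interval(3600, rain_prob=0.9, level_rate=0.05)
```

A real node would derive level_rate from its embedded flow model and rain_prob from the queried hourly precipitation forecast.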
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This paper adapts decentralized OPF optimization to the AC power flow problem in power systems with interconnected areas operated by different transmission system operators (TSOs). The proposed methodology allows finding the operating point of a particular area without explicit knowledge of the network data of the other interconnected areas; it is only necessary to exchange border information related to the tie-lines between areas. The methodology is based on the decomposition of the first-order optimality conditions of the AC power flow, which is formulated as a nonlinear programming problem. To allow better visualization of the concept of independent operation of each TSO, an artificial neural network has been used for computing the border information of the interconnected TSOs. A multi-area power flow tool can be seen as a basic building block able to address a large number of problems under a multi-TSO competitive market philosophy. The IEEE RTS-96 power system is used to show the operation and effectiveness of the decentralized AC power flow. ©2010 IEEE.
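The border-information exchange can be illustrated with a deliberately tiny stand-in: two "areas", each holding one private equation, iterate to a joint solution while exchanging only the border variable. This is a toy linear system invented for illustration, not an AC power flow and not the paper's decomposition:

```python
def solve_two_area(tol=1e-9, max_iter=1000):
    """Toy stand-in for decentralized coordination: each 'area' solves its
    own equation using only the border value published by its neighbour.
      Area A: 4*va - vb = 3      Area B: 3*vb - va = 2
    Neither area ever sees the other's full equation set."""
    va = vb = 0.0
    for _ in range(max_iter):
        va_new = (3 + vb) / 4      # area A update, using border value vb
        vb_new = (2 + va_new) / 3  # area B update, using border value va
        if abs(va_new - va) < tol and abs(vb_new - vb) < tol:
            va, vb = va_new, vb_new
            break
        va, vb = va_new, vb_new
    return va, vb

va, vb = solve_two_area()  # converges to the joint solution va = vb = 1
```

The paper's method plays the same game with the first-order optimality conditions of each area's nonlinear program, and uses a neural network to supply the neighbour's border quantities.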
Abstract:
This paper presents a new methodology for solving the optimal VAr planning problem in multi-area electric power systems, using the Dantzig-Wolfe decomposition. The original multi-area problem is decomposed into subproblems (one for each area) and a master problem (coordinator). The solution of the VAr planning problem in each area is based on the application of successive linear programming, and the coordination scheme is based on the reactive power marginal costs at the border bus. The aim of the model is to provide coordinated mechanisms to carry out the VAr planning studies maximizing autonomy and confidentiality for each area, while assuring global economy for the whole system. Using the mathematical model and computational implementation of the proposed methodology, numerical results are presented for two interconnected systems, each of them composed of three equal subsystems formed by the IEEE30 and IEEE118 test systems. © 2011 IEEE.
Abstract:
Structural Health Monitoring (SHM) denotes a system with the ability to detect and interpret adverse changes in a structure. One of the critical challenges for the practical implementation of an SHM system is the ability to detect damage under changing environmental conditions. This paper aims to characterize the temperature, load and damage effects in the sensor measurements obtained with piezoelectric transducer (PZT) patches. Data sets are collected on thin aluminum specimens under different environmental conditions and artificially induced damage states. The fuzzy clustering algorithm is used to organize the sensor measurements into a set of clusters, which makes it possible to attribute the variation in sensor data to temperature, load or any induced damage.
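A minimal 1-D fuzzy c-means sketch shows the clustering mechanism: every measurement gets a graded membership in each cluster rather than a hard label, which is what lets variation be attributed to competing causes. The data values are hypothetical stand-ins for PZT-derived features, and real SHM features are multi-dimensional:

```python
def fuzzy_c_means(points, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means: returns cluster centres and the membership
    degree of every point in every cluster (each row of u sums to 1)."""
    lo, hi = min(points), max(points)
    centres = [lo + k * (hi - lo) / (c - 1) for k in range(c)]
    u = [[0.0] * c for _ in points]
    for _ in range(iters):
        for i, x in enumerate(points):          # update memberships
            for j in range(c):
                d_j = abs(x - centres[j]) or 1e-12
                u[i][j] = 1.0 / sum(
                    (d_j / (abs(x - centres[k]) or 1e-12)) ** (2 / (m - 1))
                    for k in range(c))
        for j in range(c):                      # update centres
            den = sum(u[i][j] ** m for i in range(len(points)))
            centres[j] = sum(u[i][j] ** m * points[i]
                             for i in range(len(points))) / den
    return centres, u

# Hypothetical features: a 'baseline' group near 1.0, a 'damage' group near 5.0.
data = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
centres, memberships = fuzzy_c_means(data)
```

The membership matrix, rather than a hard assignment, is what a practitioner would inspect to decide whether a drifting measurement tracks temperature, load, or damage.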
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
We explore the problem of budgeted machine learning, in which the learning algorithm has free access to the training examples’ labels but has to pay for each attribute that is specified. This learning model is appropriate in many areas, including medical applications. We present new algorithms, based on algorithms for the multi-armed bandit problem, for choosing which attributes of which examples to purchase in the budgeted learning model. All of our approaches outperformed the current state of the art. Furthermore, we present a new means of selecting an example to purchase after the attribute is selected, instead of selecting an example uniformly at random, as is typically done. Our new example selection method improved the performance of all the algorithms we tested, both ours and those in the literature.
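The bandit connection can be sketched by treating each attribute as an arm and each paid purchase as a pull. Below is plain UCB1, with the "usefulness" of a purchase abstracted to a Bernoulli reward; the probabilities, and this simple reduction itself, are illustrative assumptions rather than the paper's algorithms:

```python
import math
import random

def ucb1(pull, n_arms, budget):
    """UCB1: spend `budget` pulls across `n_arms`, favouring arms whose
    observed mean reward plus exploration bonus is highest."""
    counts = [0] * n_arms
    sums = [0.0] * n_arms
    for t in range(budget):
        if t < n_arms:
            arm = t  # pull each arm once first
        else:
            arm = max(range(n_arms),
                      key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        sums[arm] += pull(arm)
        counts[arm] += 1
    return counts

# Hypothetical setting: 'pulling' arm a = paying to reveal attribute a on some
# example; reward 1 if the purchase helped the learner, 0 otherwise.
rng = random.Random(42)
usefulness = [0.2, 0.5, 0.8]  # assumed per-attribute payoff probabilities
counts = ucb1(lambda a: 1.0 if rng.random() < usefulness[a] else 0.0, 3, 300)
```

Under a budget of 300 purchases, most pulls concentrate on the most useful attribute while the others are still occasionally explored.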
Abstract:
High-resolution grain size analyses of three AMS ¹⁴C-dated cores from the Southeastern Brazilian shelf provide a detailed record of mid- to late-Holocene environmental changes in the Southwestern Atlantic Margin. The cores exhibit millennial variability that we associate with the previously described southward shift of the Intertropical Convergence Zone (ITCZ) average latitudinal position over the South American continent during the Holocene climatic maximum. This generated changes in the wind-driven current system of the SW Atlantic margin and modified the grain size characteristics of the sediments deposited there. Centennial variations in grain size are associated with a previously described late-Holocene enhancement of the El Niño-Southern Oscillation (ENSO) amplitude, which led to stronger NNE trade winds off eastern Brazil, favouring SW transport of sediments from the Paraíba do Sul River. This is recorded in a core from off Cabo Frio as a coarsening trend from 3000 cal. BP onwards. The ENSO enhancement also caused changes in precipitation and wind patterns in southern Brazil, allowing high-discharge events and northward extensions of the low-salinity water plume from the Rio de la Plata. We propose that this resulted in a net increase in northward alongshore transport of fine sediments, seen as a prominent fine-shift at 2000 cal. BP in a core from ~24°S on the Brazilian shelf. Wavelet and spectral analyses of the sortable silt records show a significant ~1000-yr periodicity, which we attribute to solar forcing. If correct, this is one of the first indications of solar forcing on this timescale on the Southwestern Atlantic margin.
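The idea behind the spectral step can be shown in miniature: a plain DFT applied to a synthetic, evenly sampled "sortable silt" series recovers a built-in ~1000-yr cycle despite a weak trend. The series is fabricated for illustration and is not the cores' data:

```python
import math

def dominant_period(series, dt):
    """Return the period (in the units of dt) of the strongest non-zero
    frequency of a real, evenly sampled series, via a plain DFT."""
    n = len(series)
    mean = sum(series) / n
    xs = [v - mean for v in series]
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(xs))
        im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(xs))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return n * dt / best_k

# Synthetic record: 8000 years at 50-yr steps, a 1000-yr cycle plus a weak trend.
t = [i * 50 for i in range(160)]
series = [math.sin(2 * math.pi * ti / 1000) + 0.0001 * ti for ti in t]
period = dominant_period(series, 50)  # recovers the 1000-yr cycle
```

Real sediment records also demand age-model uncertainty handling and, as in the study, wavelet analysis to see whether the periodicity persists through time.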
Abstract:
The continuous advancements and enhancements of wireless systems are enabling new compelling scenarios where mobile services can adapt according to the current execution context, represented by the computational resources available at the local device, the current physical location, people in physical proximity, and so forth. Such services, called context-aware services, require the timely delivery of all relevant information describing the current context, and that introduces several unsolved complexities, spanning from low-level context data transmission up to context data storage and replication in the mobile system. In addition, to ensure correct and scalable context provisioning, it is crucial to integrate and interoperate with different wireless technologies (WiFi, Bluetooth, etc.) and modes (infrastructure-based and ad-hoc), and to use decentralized solutions to store and replicate context data on mobile devices. These challenges call for novel middleware solutions, here called Context Data Distribution Infrastructures (CDDIs), capable of delivering relevant context data to mobile devices while hiding all the issues introduced by data distribution in heterogeneous and large-scale mobile settings. This dissertation thoroughly analyzes CDDIs for mobile systems, with the main goal of achieving a holistic approach to the design of this type of middleware solution. We discuss the main functions needed for context data distribution in large mobile systems, and we argue for the precise definition and strict enforcement of quality-based contracts between context consumers and the CDDI, used to reconfigure the main middleware components at runtime. We present the design and implementation of our proposals, in both simulation-based and real-world scenarios, along with an extensive evaluation that confirms the technical soundness of the proposed CDDI solutions. Finally, we consider three highly heterogeneous scenarios, namely disaster areas, smart campuses, and smart cities, to underline the broad technical validity of our analysis and solutions under different network deployments and quality constraints.
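The quality-based contract idea can be sketched as a context store that withholds data violating a consumer's freshness requirement. The class and key names are invented for illustration; a real CDDI adds distribution, replication, and runtime reconfiguration on top of this:

```python
import time

class ContextStore:
    """Minimal CDDI-flavoured sketch: context items are delivered only while
    they satisfy a per-consumer quality contract (here, a maximum age)."""

    def __init__(self):
        self._items = {}  # key -> (value, timestamp)

    def publish(self, key, value, now=None):
        self._items[key] = (value, time.time() if now is None else now)

    def query(self, key, max_age_s, now=None):
        now = time.time() if now is None else now
        item = self._items.get(key)
        if item is None or now - item[1] > max_age_s:
            return None  # stale or missing: contract violated, withhold it
        return item[0]

store = ContextStore()
store.publish("location/device42", "room-3.14", now=100.0)
fresh = store.query("location/device42", max_age_s=30, now=120.0)  # 20 s old
stale = store.query("location/device42", max_age_s=30, now=200.0)  # 100 s old
```

In the dissertation's framing, a contract violation like the stale query above is exactly the signal that triggers reconfiguration of the middleware's distribution components.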
Abstract:
Nowadays, in developed countries, excessive food intake, in conjunction with decreased physical activity, has led to an increase in lifestyle-related diseases, such as obesity, cardiovascular diseases, type-2 diabetes, a range of cancer types and arthritis. The socio-economic importance of such lifestyle-related diseases has encouraged countries to increase their research efforts, and many projects have recently been initiated in research that focuses on the relationship between food and health. Thanks to these efforts and to the growing availability of technologies, food companies are beginning to develop healthier food. The need for rapid and affordable methods to help the food industry in ingredient selection has stimulated the development of in vitro systems that simulate the physiological functions to which food components are submitted when administered in vivo. One of the most promising tools now available appears to be in vitro digestion, which aims at predicting, in a comparative way among analogous food products, the bioaccessibility of the nutrients of interest. The foodomics approach has been chosen in this work to evaluate the modifications occurring during the in vitro digestion of selected protein-rich food products. The measurement of protein breakdown was performed via NMR spectroscopy, the only technique capable of observing, directly in the simulated gastric and duodenal fluids, the soluble oligo- and polypeptides released during the in vitro digestion process. The overall approach pioneered during this PhD work has been discussed and promoted in a large scientific community, with specialists networked under the INFOGEST COST Action, which recently released a harmonized protocol for in vitro digestion. NMR spectroscopy, when used in tandem with in vitro digestion, generates a new concept which provides an additional attribute to describe food quality: comparative digestibility, which measures the improvement in nutrient bioaccessibility.