30 results for Correlation based analysis
Abstract:
The current industry trend is towards using commercially available off-the-shelf (COTS) multicores for developing real-time embedded systems, as opposed to using custom-made hardware. In typical implementations of such COTS-based multicores, multiple cores access the main memory via a shared bus. This often leads to contention on this shared channel, which increases the response time of the tasks. Analyzing this increased response time, considering the contention on the shared bus, is challenging on COTS-based systems, mainly because bus arbitration protocols are often undocumented and the exact instants at which the shared bus is accessed by tasks are not explicitly controlled by the operating system scheduler; they are instead a result of cache misses. This paper makes three contributions towards analyzing tasks scheduled on COTS-based multicores. Firstly, we describe a method to model the memory access patterns of a task. Secondly, we apply this model to analyze the worst-case response time for a set of tasks. Thirdly, although the parameters of the request profile can be obtained by static analysis, we provide an alternative method to obtain them experimentally using performance monitoring counters (PMCs). We also compare our work against an existing approach and show that ours outperforms it by providing a tighter upper bound on the number of bus requests generated by a task.
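To make the idea of a request profile concrete, the following is a minimal sketch, assuming PMC readings arrive as per-sample cache-miss counts; the function name, the sampling scheme and the sliding-window rule are illustrative assumptions, not the method described in the paper.

```python
# Illustrative sketch: derive an upper bound on bus requests in any
# time window from PMC cache-miss samples. Every name and parameter
# here is an assumption for illustration.

def request_bound(miss_samples, sample_period_us, window_us):
    """Upper-bound the bus requests a task may issue in any window of
    length `window_us`, given cache-miss counts measured per sample of
    `sample_period_us` microseconds (each miss triggers a bus request)."""
    k = max(1, window_us // sample_period_us)  # samples per window
    worst = 0
    for start in range(len(miss_samples) - k + 1):
        worst = max(worst, sum(miss_samples[start:start + k]))
    return worst

# Example: miss counts sampled every 100 microseconds.
trace = [12, 3, 40, 7, 25, 31, 2, 18]
print(request_bound(trace, sample_period_us=100, window_us=300))  # -> 72
```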
Abstract:
Consider the problem of scheduling a set of sporadic tasks on a multiprocessor system to meet deadlines using a task-splitting scheduling algorithm. Task-splitting (also called semi-partitioning) scheduling algorithms assign most tasks to just one processor, but a few tasks are assigned to two or more processors, and these are dispatched in a way that ensures a task never executes on two or more processors simultaneously. A particular type of task-splitting algorithm, called slot-based task-splitting, is of special interest because of its ability to schedule tasks at high processor utilizations. We present a new schedulability analysis for slot-based task-splitting scheduling algorithms that takes overheads into account, as well as a new task assignment algorithm.
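As a rough illustration of how a semi-partitioned assignment can work, here is a sketch under simplifying assumptions (utilization-based first-fit, a unit capacity per processor, and a greedy splitting rule); the paper's actual assignment algorithm and its overhead-aware schedulability test are more involved.

```python
# Illustrative sketch of semi-partitioned (task-splitting) assignment.
# The per-processor capacity and the splitting rule are assumptions,
# not the schedulability test from the paper.

def assign(utilizations, num_procs, capacity=1.0):
    load = [0.0] * num_procs
    placement = []  # per task: list of (processor, utilization share)
    for u in utilizations:
        # First-fit: place the whole task on one processor if it fits.
        for p in range(num_procs):
            if load[p] + u <= capacity:
                load[p] += u
                placement.append([(p, u)])
                break
        else:
            # Split across processors with spare capacity. (The rule
            # that a split task never runs on two processors at once
            # is enforced by the dispatcher, e.g. via time slots -
            # not modelled here.)
            parts, rest = [], u
            for p in range(num_procs):
                spare = capacity - load[p]
                if spare > 0 and rest > 0:
                    share = min(spare, rest)
                    load[p] += share
                    parts.append((p, share))
                    rest -= share
            if rest > 1e-9:
                raise ValueError("task set not assignable")
            placement.append(parts)
    return placement

print(assign([0.6, 0.6, 0.6], num_procs=2))
# -> [[(0, 0.6)], [(1, 0.6)], [(0, 0.4), (1, 0.2)]]
```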
Abstract:
This article describes a finite element-based formulation for the statistical analysis of the response of stochastic structural composite systems whose material properties are described by random fields. A first-order technique is used to obtain the second-order statistics of the structural response, considering the means and variances of the displacement and stress fields of plate or shell composite structures. The propagation of uncertainties depends on sensitivities, taken as measures of variation effects. The adjoint variable method is used to obtain the sensitivity matrix; this method is appropriate for composite structures due to the large number of random input parameters. Dominant effects on the stochastic characteristics are studied by analyzing the influence of the different random parameters. In particular, a study of the influence of anisotropy on uncertainty propagation in angle-ply composites is carried out based on the proposed approach.
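The first-order technique can be condensed into one formula: if S = du/dθ is the sensitivity matrix of the response u with respect to the random inputs θ, evaluated at the mean, then Cov[u] ≈ S Cov[θ] Sᵀ. A minimal numeric sketch follows; the matrices are invented for illustration.

```python
import numpy as np

# First-order second-moment propagation: with the sensitivity matrix
# S = du/dtheta evaluated at the mean input, the covariance of the
# response is approximated by S @ C_theta @ S.T.
S = np.array([[0.8, 0.1],        # sensitivities of response u1
              [0.2, 0.5]])       # sensitivities of response u2
C_theta = np.diag([0.04, 0.09])  # variances of two random inputs

C_u = S @ C_theta @ S.T
print("approximate response variances:", np.diag(C_u))
```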
Abstract:
In recent years, several solutions have been proposed to extend PROFIBUS to support wired and wireless stations in the same network. In this paper we compare two of those solutions: one in which the interconnection between wired and wireless stations is made by repeaters, and another in which it is made by bridges. The comparison is both qualitative and numerical, based on simulation models of the two architectures.
Abstract:
Glass fibre-reinforced plastics (GFRP) have been considered inherently difficult to recycle, due both to the cross-linked nature of thermoset resins, which cannot be remoulded, and to the complex composition of the composite itself. Presently, most GFRP waste is landfilled, leading to negative environmental impacts and additional costs. With increasing awareness of environmental matters and the consequent desire to save resources, recycling would convert an expensive waste disposal problem into a profitable reusable material. In this study, efforts were made to recycle ground GFRP waste, originating from pultrusion production scrap, into new and sustainable composite materials. For this purpose, GFRP waste recyclates were incorporated into polyester-based mortars as fine aggregate and filler replacements, at different load contents and particle size distributions. The potential recycling solution was assessed through the mechanical behaviour of the resultant GFRP-waste-modified polymer mortars. Results revealed that GFRP-waste-filled polymer mortars present improved flexural and compressive behaviour over unmodified polyester-based mortars, thus indicating the feasibility of reusing the waste in polymer mortars and concrete. © 2011, Advanced Engineering Solutions.
Abstract:
Hard real-time multiprocessor scheduling has seen, in recent years, the flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling in order to achieve efficient utilization of the system's processing resources with strong schedulability guarantees and low dispatching overheads. The sub-class of slot-based "task-splitting" scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. However, until now no unified schedulability theory existed for such algorithms; each was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). This new theory is based on exact schedulability tests, thereby also overcoming many sources of pessimism in existing analyses. In turn, since schedulability testing guides the task assignment under the schemes in consideration, we also formulate an improved task assignment procedure. As the other main contribution of this article, and in response to the fact that many unrealistic assumptions present in the original theory tend to undermine the theoretical potential of such scheduling schemes, we identified and modelled into the new analysis all overheads incurred by the algorithms in consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
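The exact tests in the article are specific to slot-based semi-partitioned scheduling; as a generic illustration of what an exact, iteration-based schedulability test looks like, the sketch below implements the classic uniprocessor response-time recurrence R_i = C_i + sum over higher-priority tasks j of ceil(R_i / T_j) * C_j. This is not the article's analysis, but it conveys the fixed-point style of such tests.

```python
import math

# Classic uniprocessor response-time analysis: an illustration of an
# exact, fixed-point schedulability test (the slot-based analysis in
# the article is more involved).
def response_time(c_i, higher_prio, deadline):
    """higher_prio: list of (C_j, T_j) for higher-priority tasks.
    Returns the worst-case response time, or None if it exceeds the
    deadline (task unschedulable)."""
    r = c_i
    while True:
        r_next = c_i + sum(math.ceil(r / t) * c for c, t in higher_prio)
        if r_next > deadline:
            return None
        if r_next == r:
            return r
        r = r_next

# Task with C=2, D=10, interfered by tasks (C=1, T=4) and (C=2, T=10):
print(response_time(2, [(1, 4), (2, 10)], deadline=10))  # -> 6
```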
Abstract:
This paper presents a tool that automatically gathers data from real energy markets and generates scenarios, capturing and improving market players' profiles and strategies through knowledge discovery in databases supported by artificial intelligence techniques, data mining algorithms and machine learning methods. It provides the means to generate scenarios with different dimensions and characteristics, ensuring the representation of real and adapted markets and their participating entities. The scenario generator module enhances the MASCEM (Multi-Agent Simulator of Competitive Electricity Markets) simulator, making it a more effective tool for decision support. The implementation of the proposed module enables researchers and electricity market participants to analyze data, create realistic scenarios and experiment with them. Moreover, applying knowledge discovery techniques to real data also allows MASCEM agents' profiles and strategies to be improved, resulting in a better representation of real market players' behavior. This work aims to improve the comprehension of electricity markets and of the interactions among the involved entities through adequate multi-agent simulation.
Abstract:
Electricity markets are complex environments, involving a large number of different entities with specific characteristics and objectives, making their decisions and interacting in a dynamic setting. Game theory has been widely used to support decisions in competitive environments; therefore its application to electricity markets can prove to be a high-potential tool. This paper proposes a new scenario analysis algorithm, which applies game theory to evaluate and preview different scenarios, providing players with the ability to react strategically and exhibit the behavior that best fits their objectives. The model includes forecasts of competitor players' actions, used to build models of their behavior and define the most probable scenarios. Once the scenarios are defined, game theory is applied to support the choice of the action to be performed. Our use of game theory is intended to support one specific agent, not to achieve equilibrium in the market. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. The scenario analysis algorithm has been tested within MASCEM, and our experimental findings with a case study based on real data from the Iberian Electricity Market are presented and discussed.
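A stripped-down sketch of scenario-based action selection is given below; the scenario probabilities, bid actions and payoff numbers are invented, and picking the action with the highest expected payoff is only one possible decision rule, not necessarily the one implemented in MASCEM.

```python
# Illustrative sketch of scenario-based action selection (all names
# and numbers are assumptions): given forecast scenarios with
# estimated probabilities and a payoff table, pick the bid that
# maximizes the player's expected payoff.
scenarios = {"low_demand": 0.3, "avg_demand": 0.5, "high_demand": 0.2}
payoff = {  # payoff[action][scenario], e.g. expected profit in EUR
    "bid_40": {"low_demand": 100, "avg_demand": 180, "high_demand": 220},
    "bid_50": {"low_demand":  40, "avg_demand": 210, "high_demand": 300},
}

def best_action(scenarios, payoff):
    expected = {a: sum(p * payoff[a][s] for s, p in scenarios.items())
                for a in payoff}
    return max(expected, key=expected.get), expected

print(best_action(scenarios, payoff))
# -> ('bid_50', {'bid_40': 164.0, 'bid_50': 177.0})
```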
Abstract:
The current economic crisis has further intensified economists' concern with identifying new directions for the sustainable development of society. In this context, human capital has crystallised as the key variable of the creative economy and of the knowledge-based society. As such, we have directed the research underlying this paper towards identifying the most telling indicators of human capital for meeting the demands of the knowledge-based society and sustainable development, as well as towards a comprehensive analysis of human capital in the EU countries and a comparative analysis of Romania and Portugal. The methodology is based on interdisciplinary triangulation, involving approaches from the perspectives of human resource management, economics and economic statistics. The research techniques used consist of content analysis and the investigation of secondary data from international organisations accredited in this field, such as the United Nations Development Programme's Human Development Reports, the World Bank's World Development Reports, the International Labour Organisation, Eurostat, and the European Commission's Eurobarometer surveys and reports on human capital. The research results emphasise both the similarities and the differences between the two countries under comparison, as well as the main directions in which to invest for the development of human capital.
Abstract:
Over the past decades, several approaches to schedulability analysis have been proposed for both uniprocessor and multiprocessor real-time systems. Although different techniques are employed, very little has been put forward in terms of formal specifications, with the consequent possibility of misinterpretations or ambiguities in the problem statement. Using a logic-based approach to schedulability analysis in the design of hard real-time systems eases the synthesis of correct-by-construction procedures for both static and dynamic verification processes. In this paper we propose a novel approach to schedulability analysis based on a timed temporal logic with time durations. Our approach subsumes classical methods for uniprocessor schedulability analysis over compositional resource models, by providing the developer with counterexamples and by ruling out schedules that cause unsafe violations in the system. We also provide an example showing the effectiveness of our proposal.
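As a flavour of what a duration-based schedulability requirement can look like (a generic duration-calculus-style formula written here as an assumption, not the exact logic of the paper): for a task τ_i with worst-case execution time C_i and relative deadline D_i, every interval of length D_i that starts at a release of τ_i must accumulate at least C_i time units of execution of τ_i.

```latex
% Generic duration-style schedulability condition (illustrative only):
% ell is the length of the observation interval, "run_i" holds while
% task tau_i executes, and the integral is its accumulated run time.
\Box\,\bigl(\mathit{release}_i \;\Rightarrow\;
    (\ell = D_i \Rightarrow \textstyle\int \mathit{run}_i \ge C_i)\bigr)
```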
Abstract:
Astringency is an organoleptic property of beverages and food products resulting mainly from the interaction of salivary proteins with dietary polyphenols. It is of great importance to consumers, but the only effective way of measuring it involves trained sensorial panellists, providing subjective and expensive responses. Current chemical evaluations try to screen food astringency by means of polyphenol and protein precipitation procedures, but these are far from the real human astringency sensation, where not all polyphenol–protein interactions lead to the occurrence of precipitate. Here, a novel chemical approach that tries to mimic protein–polyphenol interactions in the mouth is presented to evaluate astringency. A protein, acting as a salivary protein, is attached to a solid support to which the polyphenol binds (just as happens when drinking wine), with a subsequent colour alteration that is fully independent of the occurrence of precipitate. Employing this simple concept, Bovine Serum Albumin (BSA) was selected as the model salivary protein and used to cover the surface of silica beads. Tannic Acid (TA), employed as the model polyphenol, was allowed to interact with the BSA on the silica support, and its adsorption to the protein was detected by reaction with Fe(III) and subsequent colour development. Quantitative data on TA in the samples were extracted by colorimetric or reflectance studies over the solid materials. In the colorimetric approach, the analysis was done by taking a regular picture with a digital camera, opening the image file in common software and extracting the colour coordinates from the HSL (Hue, Saturation, Lightness) and RGB (Red, Green, Blue) colour model systems; linear ranges were observed from 10.6 to 106.0 μmol L−1. The reflectance approach was based on the Kubelka–Munk response, showing a linear gain with concentrations from 0.3 to 10.5 μmol L−1. In both approaches, semi-quantitative estimation of TA was possible by direct eye comparison. The correlation between the levels of adsorbed TA and the astringency of beverages was tested by using the assay to check the astringency of wines and comparing the results to the responses of sensorial panellists. The results of the two methods correlated well. The proposed sensor has significant potential as a robust tool for the quantitative/semi-quantitative evaluation of astringency in wine.
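The reflectance branch relies on the Kubelka–Munk function, K/S = (1 − R)² / (2R), which grows approximately linearly with analyte concentration over the stated range. The sketch below applies it to an invented calibration set and fits the straight line; the reflectance values are illustrative, not data from the study.

```python
# Kubelka-Munk treatment of reflectance data: K/S = (1 - R)^2 / (2 R)
# is approximately linear in analyte concentration. All values below
# are invented for illustration only.
def kubelka_munk(R):
    return (1.0 - R) ** 2 / (2.0 * R)

# (concentration in umol/L, measured reflectance on a 0..1 scale)
calibration = [(0.3, 0.92), (2.0, 0.71), (5.0, 0.48), (10.5, 0.30)]

# Least-squares fit of K/S = a * conc + b (plain Python, no numpy).
xs = [c for c, _ in calibration]
ys = [kubelka_munk(r) for _, r in calibration]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
    / sum((x - mx) ** 2 for x in xs)
b = my - a * mx
print(f"K/S = {a:.3f} * c + {b:.3f}")
```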
Abstract:
Sulfamethoxazole (SMX) is among the antibiotics employed in aquaculture for prophylactic and therapeutic reasons. Environmental and food spread may be prevented by controlling its levels at several stages of fish farming. The present work proposes, for this purpose, new SMX-selective electrodes for the potentiometric determination of this sulphonamide in water. The selective membranes were made of polyvinyl chloride (PVC), with tetraphenylporphyrin manganese(III) chloride or a cyclodextrin-based compound acting as ionophore. 2-Nitrophenyl octyl ether was employed as plasticizer, and tetraoctylammonium, dimethyldioctadecylammonium bromide or potassium tetrakis(4-chlorophenyl) borate was used as anionic or cationic additive. The best analytical performance was obtained for ISEs of tetraphenylporphyrin manganese(III) chloride with 50 mol% of potassium tetrakis(4-chlorophenyl) borate relative to the ionophore. Nernstian behaviour was observed from 4.0 × 10−5 to 1.0 × 10−2 mol/L (10.0 to 2500 µg/mL), and the limit of detection was 1.2 × 10−5 mol/L (3.0 µg/mL). In general, the electrodes displayed steady potentials in the pH range of 6 to 9. Emf equilibrium was reached within 15 s at all concentration levels. The electrodes revealed good discriminating ability in environmental samples. Analytical application to contaminated waters showed recoveries from 96 to 106%.
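For context, "Nernstian behaviour" means the electrode potential varies linearly with the logarithm of the ion activity at the theoretical slope; for a monovalent anion such as deprotonated SMX at 25 °C, that slope is about −59.2 mV per decade:

```latex
% Nernst response of an ion-selective electrode for an anion of
% charge z = -1 at 25 C (theoretical slope of about -59.2 mV/decade):
E = E^{0} + \frac{2.303\,RT}{zF}\,\log a_{\mathrm{SMX}^{-}}
  \approx E^{0} - 59.2\ \mathrm{mV} \cdot \log a_{\mathrm{SMX}^{-}}
```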
Abstract:
IEEE International Conference on Communications (IEEE ICC 2015) - Communications QoS, Reliability and Modeling, 8-12 June 2015, London, United Kingdom.
Abstract:
Objective: Public health organizations recommend that preschool-aged children accumulate at least 3 h of physical activity (PA) daily. Objective monitoring using pedometers offers an opportunity to measure preschoolers' PA and assess compliance with this recommendation. The purpose of this study was to derive step-based recommendations consistent with the 3 h PA recommendation for preschool-aged children. Method: The study sample comprised 916 preschool-aged children, aged 3 to 6 years (mean age = 5.0 ± 0.8 years). Children were recruited from kindergartens located in Portugal between 2009 and 2013. Children wore an ActiGraph GT1M accelerometer that measured PA intensity and steps per day simultaneously over a 7-day monitoring period. Receiver operating characteristic (ROC) curve analysis was used to identify the daily step count threshold associated with meeting the daily 3 h PA recommendation. Results: A significant correlation was observed between minutes of total PA and steps per day (r = 0.76, p < 0.001). The optimal step count for ≥ 3 h of total PA was 9099 steps per day (sensitivity: 90%; specificity: 66%), with an area under the ROC curve of 0.86 (95% CI: 0.84 to 0.88). Conclusion: Preschool-aged children who accumulate fewer than 9000 steps per day may be considered insufficiently active.
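The ROC step can be reproduced with standard tooling. Below is a sketch on synthetic data using scikit-learn's roc_curve; the cut-point is chosen with Youden's J statistic (sensitivity + specificity − 1), which is a common choice but an assumption here, as the abstract does not state the study's exact criterion.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Sketch of the ROC step on synthetic data: y = 1 if the child met the
# 3-hour PA recommendation, x = steps/day. The cut-point below uses
# Youden's J; the study may have used a different criterion.
rng = np.random.default_rng(0)
steps_active = rng.normal(11000, 2000, 500)   # children meeting 3 h PA
steps_inactive = rng.normal(7500, 2000, 400)  # children not meeting it
x = np.concatenate([steps_active, steps_inactive])
y = np.concatenate([np.ones(500), np.zeros(400)])

fpr, tpr, thresholds = roc_curve(y, x)
j = tpr - fpr                      # Youden's J at each threshold
best = thresholds[np.argmax(j)]
print(f"AUC = {roc_auc_score(y, x):.2f}, cut-point ~ {best:.0f} steps/day")
```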
Abstract:
The complexity of systems is considered an obstacle to the progress of the IT industry. Autonomic computing is presented as the alternative to cope with this growing complexity. It is a holistic approach, in which systems are able to configure, heal, optimize, and protect themselves. Web-based applications are an example of systems where complexity is high. The number of components, their interoperability, and workload variations are factors that may lead to performance failures or unavailability scenarios. The occurrence of these scenarios affects the revenue and reputation of businesses that rely on these types of applications. In this article, we present a self-healing framework for Web-based applications (SHõWA). SHõWA is composed of several modules, which monitor the application, analyze the data to detect and pinpoint anomalies, and execute recovery actions autonomously. The monitoring is done by a small aspect-oriented programming agent. This agent does not require changes to the application source code and includes adaptive and selective algorithms to regulate the level of monitoring. Anomalies are detected and pinpointed by means of statistical correlation. The data analysis detects changes in the server response time and determines whether those changes are correlated with the workload or are due to a performance anomaly. In the presence of a performance anomaly, the data analysis pinpoints it, and SHõWA then executes a recovery procedure. We also present a study of the detection and localization of anomalies, the accuracy of the data analysis, and the performance impact induced by SHõWA. Two benchmarking applications, exercised through dynamic workloads, and different types of anomalies were considered in the study. The results reveal that (1) SHõWA detects and pinpoints anomalies while the number of end users affected is still low; (2) SHõWA detected the anomalies without raising any false alarm; and (3) SHõWA does not induce a significant performance overhead (throughput was affected by less than 1%, and the added response time delay was no more than 2 milliseconds).
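The workload/response-time correlation idea can be illustrated in a few lines (a sketch, not SHõWA's actual algorithm or thresholds): a rise in response time that tracks the request rate is attributed to load, while a rise with low correlation to the workload is flagged as a possible anomaly.

```python
from statistics import correlation  # Python 3.10+

# Sketch of workload/response-time correlation analysis (illustrative
# only; the threshold value is an assumption): response-time growth
# that follows the request rate is classified as load-induced, growth
# that does not is flagged as a possible performance anomaly.
def classify(requests_per_s, resp_time_ms, threshold=0.7):
    r = correlation(requests_per_s, resp_time_ms)
    return ("load-induced" if r >= threshold else "possible anomaly"), r

# Response time follows the workload -> load-induced:
print(classify([100, 150, 220, 300], [20, 28, 41, 55]))
# Response time climbs while the workload stays flat -> anomaly:
print(classify([120, 118, 121, 119], [20, 45, 90, 160]))
```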