857 results for clustering and QoS-aware routing
Abstract:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract:
OBJECTIVE: To identify clustering areas of infants exposed to HIV during pregnancy and their association with indicators of primary care coverage and socioeconomic condition. METHODS: Ecological study in which the unit of analysis was primary care coverage areas in the city of Porto Alegre, Southern Brazil, in 2003. Geographical Information System and spatial analysis tools were used to describe indicators of primary care coverage areas and socioeconomic condition, and to estimate the prevalence of liveborn infants exposed to HIV during pregnancy and delivery. Data were obtained from Brazilian national databases. The association between different indicators was assessed using Spearman's nonparametric test. RESULTS: An association was found between HIV infection and high birth rates (r=0.22, p<0.01) and lack of prenatal care (r=0.15, p<0.05). The highest HIV infection rates were seen in areas with poor socioeconomic conditions and difficult access to health services (r=0.28, p<0.01). The association found between a higher rate of prenatal care among HIV-infected women and adequate immunization coverage (r=0.35, p<0.01) indicates that early detection of HIV infection is effective in areas with better primary care services. CONCLUSIONS: Urban poverty is a strong determinant of mother-to-child HIV transmission, but this trend can be countered with health surveillance at the primary care level.
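Spearman's nonparametric test, used above to associate indicators, can be sketched in a few lines. This is an illustrative implementation (function names are mine, not the study's): rank both variables, handling ties with average ranks, then compute the Pearson correlation of the ranks.

```python
import numpy as np

def rankdata(x):
    """Assign 1-based ranks, giving tied values their average rank."""
    x = np.asarray(x, dtype=float)
    order = np.argsort(x)
    sx = x[order]
    ranks = np.empty(len(x), dtype=float)
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and sx[j + 1] == sx[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2.0 + 1.0  # average rank of the tie group
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))
```

For a perfectly monotone relationship the coefficient is 1 (or -1 when decreasing), matching the sign convention of the r values reported in the abstract.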
Abstract:
One of the most difficult issues of e-Learning is student assessment. Already a demanding task for theoretical topics, it becomes even more challenging when the topics under evaluation are practical. ISCAP’s Information Systems Department comprises about twenty teachers who have for several years been using an e-learning environment (at the moment Moodle 2.3) combined with traditional assessment. They are now planning and implementing a new e-learning assessment strategy. This effort was undertaken in order to evaluate a practical topic (the use of spreadsheets to solve management problems) common to shared courses of several undergraduate degree programs. The same team is already experienced in the assessment of theoretical information systems topics using the b-learning platform. Therefore, this project works as an extension of previous experiences, with the team aware of the additional difficulties due to the practical nature of the topics. This paper describes the project and presents the two cycles of the action research methodology used to conduct the research. The goal of the first cycle was to produce a database of questions. When it was implemented with a pilot group of students, several problems were identified. Subsequently, the second cycle consisted of solving the identified problems, preparing the database and all the players for a broader implementation. For each cycle, all the phases, their drawbacks and achievements are described. This paper suits all those who are, or are planning to be, in the process of shifting their assessment strategy from a traditional one to one supported by an e-learning platform.
Abstract:
The Bologna Process globalized higher education, creating a unified architecture that strengthens higher education and deepens the ongoing interconnection of higher education policy spaces around the world, in particular in Europe. The aim of this work is to present a model for the identification and classification of skills and learning outcomes, based on the official documents of the course units (syllabus and assessment components) of a Higher Education course. We are aware that the adoption of this model by different institutions will contribute to the interoperability of learning outcomes, thus enhancing the mobility of teachers and students within the EHEA (European Higher Education Area) and third countries.
Abstract:
Economic development has always been connected to commercial exchanges between people, driven by the need to satisfy their wants. With the increasing growth of international business and more competitive and demanding markets, exporting has become an important first step towards internationalisation. Unlike in the past, companies must be aware that entering the current global market is risky and requires elaborate technical procedures. Internationalisation should not be treated as an isolated event of business management. The first part of this paper aims to understand the export process and place it in the current stage of international trade, keeping in mind the framework of exports under customs law. We then tried to understand how Portuguese companies should face this process in their internationalisation strategy, and what skills organisations must acquire to be able to export competitively in the current scenario of globalisation. The investigation was based on interviews with companies that, through a process of internationalisation by exporting, have established themselves strongly in foreign markets. This investigation allowed us to analyse the companies’ motivations to become international, as well as the selection criteria for export destinations. It was also possible to identify the main obstacles to the internationalisation of Portuguese companies. We concluded that companies that choose exporting as a way to become international acquire specific skills that enable them to be competitive in international trade. However, the study could not answer the second initial question about whether the measures implemented by Customs boost exports.
Abstract:
Seismic data are difficult to analyze, and classical mathematical tools reveal strong limitations in exposing hidden relationships between earthquakes. In this paper, we study earthquake phenomena from the perspective of complex systems. Global seismic data covering the period from 1962 to 2011 are analyzed. The events, characterized by their magnitude, geographic location and time of occurrence, are divided into groups, either according to the Flinn-Engdahl (F-E) seismic regions of Earth or using a rectangular grid based on latitude and longitude coordinates. Two methods of analysis are considered and compared in this study. In the first method, the distributions of magnitudes are approximated by Gutenberg-Richter (G-R) distributions and the fitted parameters are used to reveal the relationships among regions. In the second method, the mutual information is calculated and adopted as a measure of similarity between regions. In both cases, using clustering analysis, visualization maps are generated, providing an intuitive and useful representation of the complex relationships present in the seismic data. Such relationships might not be perceived on classical geographic maps. Therefore, the generated charts are a valid alternative to other visualization tools for understanding the global behavior of earthquakes.
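As an illustration of the second method, the mutual information between paired observations from two regions can be estimated from a joint histogram. This is a minimal sketch (the bin count and function names are assumptions, not taken from the paper); a clustering step would then use something like 1 minus a normalized MI as a distance between regions.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate I(X;Y) in nats from a 2-D histogram of paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of X
    py = pxy.sum(axis=0, keepdims=True)       # marginal of Y
    nz = pxy > 0                              # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

Identical series yield a high MI (the entropy of the binned variable), while shuffling one series destroys the dependence and drives the estimate toward zero, which is the property that makes MI usable as a similarity measure between regions.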
Abstract:
Clustering analysis is a useful tool to detect and monitor disease patterns and, consequently, to contribute to effective population disease management. Portugal has the highest incidence of tuberculosis in the European Union (21.6 cases per 100,000 inhabitants in 2012), although it has been decreasing consistently. Two critical PTB (Pulmonary Tuberculosis) areas, the metropolitan Oporto and metropolitan Lisbon regions, were previously identified through spatial and space-time clustering of PTB incidence rates and risk factors. Identifying clusters of temporal trends can further inform policy makers about municipalities showing faster or slower TB control improvement.
Abstract:
Research on the problem of feature selection for clustering continues to develop. This is a challenging task, mainly due to the absence of class labels to guide the search for relevant features. Categorical feature selection for clustering has rarely been addressed in the literature, with most of the proposed approaches having focused on numerical data. In this work, we propose an approach to simultaneously cluster categorical data and select a subset of relevant features. Our approach is based on a modification of a finite mixture model (of multinomial distributions), where a set of latent variables indicates the relevance of each feature. To estimate the model parameters, we implement a variant of the expectation-maximization algorithm that simultaneously selects the subset of relevant features, using a minimum message length criterion. The proposed approach compares favourably with two baseline methods: a filter based on an entropy measure and a wrapper based on mutual information. The results obtained on synthetic data illustrate the ability of the proposed expectation-maximization method to recover ground truth. An application to real data, drawn from official statistics, shows its usefulness.
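A minimal sketch of the base machinery, an EM loop for a mixture of multinomials over count data, is shown below. It deliberately omits the feature-saliency latent variables and the minimum message length criterion that the paper adds on top, so it only illustrates the underlying model, not the proposed method itself.

```python
import numpy as np

def em_multinomial_mixture(X, k, iters=100, seed=0):
    """Fit a k-component mixture of multinomials to count data X (n x d) by EM."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(k, 1.0 / k)                    # mixing weights
    theta = rng.dirichlet(np.ones(d), size=k)   # per-component category probabilities
    for _ in range(iters):
        # E-step: responsibilities, computed in the log domain for stability
        log_r = np.log(pi) + X @ np.log(theta).T          # (n, k)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and category probabilities
        pi = r.mean(axis=0)
        theta = (r.T @ X) + 1e-9                          # small floor avoids log(0)
        theta /= theta.sum(axis=1, keepdims=True)
    return pi, theta, r
```

The paper's modification would insert, per feature, a relevance indicator whose posterior decides whether the feature follows component-specific or common multinomial parameters; the loop structure stays the same.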
Abstract:
OBJECTIVE: To assess the determinants of the lack of pharmacological treatment for hypertension. METHODS: In 2005, 3,323 Mozambicans aged 25-64 years old were evaluated. Blood pressure, weight, height and smoking status were assessed following the Stepwise Approach to Chronic Disease Risk Factor Surveillance. Hypertensives (systolic blood pressure ≥ 140 mmHg and/or diastolic blood pressure ≥ 90 mmHg and/or antihypertensive drug therapy) were evaluated for awareness of their condition, pharmacological and non-pharmacological management, as well as use of herbal or traditional remedies. Prevalence ratios (PR) were calculated, adjusted for sociodemographic characteristics, cardiovascular risk factors and non-pharmacological treatment. RESULTS: Most of the hypertensive subjects (92.3%), and nearly half of those aware of their condition, were not treated pharmacologically. Among those aware, the prevalence of untreated hypertension was higher in men (PR = 1.61; 95% confidence interval (95%CI) 1.10;2.36) and lower in subjects under non-pharmacological treatment (PR = 0.58; 95%CI 0.42;0.79); there was no significant association with traditional treatments (PR = 0.75; 95%CI 0.44;1.26). CONCLUSIONS: The lack of pharmacological treatment for hypertension was more frequent in men, and was not influenced by the presence of other cardiovascular risk factors; it could not be explained by the use of alternative treatments such as herbal/traditional medicines or non-pharmacological management. It is important to understand the reasons behind the lack of management of diagnosed hypertension and to implement appropriate corrective actions to reduce the gap in access to healthcare between developed and developing countries.
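Prevalence ratios with 95% confidence intervals, like those reported above, can be computed as follows. This is a generic textbook-style sketch (a Wald-type interval on the log scale for unadjusted counts), not necessarily the exact adjusted procedure used in the study:

```python
import math

def prevalence_ratio(a, n1, b, n0, z=1.96):
    """PR of exposed (a cases of n1) vs unexposed (b cases of n0),
    with a Wald-type 95% CI built on the log scale."""
    p1, p0 = a / n1, b / n0
    pr = p1 / p0
    # standard error of log(PR) for independent binomial proportions
    se = math.sqrt((1 - p1) / a + (1 - p0) / b)
    lo, hi = pr * math.exp(-z * se), pr * math.exp(z * se)
    return pr, lo, hi
```

For example, 30 untreated of 100 men vs 20 of 100 women gives PR = 1.5, with an interval that widens as the case counts shrink. Adjusted PRs, as in the study, would instead come from a regression model such as Poisson regression with robust variance.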
Abstract:
Emerging smart grid systems must be able to react quickly and predictably, adapting their operation to changing energy supply and demand by controlling energy-consuming and energy storage devices. An intrinsic problem with smart grids is that energy produced from in-house renewable sources is affected by fluctuating weather factors. The applications driving smart grid operation must rely on a solid communication network that is secure, highly scalable, and always available. Thus, any communication infrastructure for smart grids should support their potential to produce large quantities of real-time data, with the goal of reacting to state changes by actuating on devices in real time, while providing Quality of Service (QoS).
Abstract:
This paper focuses on the analysis of a demand response model in a smart grid context, considering a contingency scenario. A fuzzy clustering technique is applied to the developed demand response model and an analysis is performed for the contingency scenario. Model considerations and architecture are described. The developed demand response model aims to support consumers' decisions regarding their consumption needs and possible economic benefits.
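As an illustration, one common fuzzy clustering technique, fuzzy c-means, can be sketched as below; the abstract does not specify which fuzzy algorithm is used, so this choice is an assumption. Each consumer profile gets a graded membership in every cluster rather than a hard assignment.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy c-means: returns cluster centers and the (n x c) membership matrix U."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.dirichlet(np.ones(c), size=n)     # random initial memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m                            # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # distances from every point to every center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # standard membership update: u_ik proportional to d_ik^(-2/(m-1))
        U = 1.0 / (d ** (2.0 / (m - 1.0)))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U
```

The fuzzifier m controls how soft the partition is: m close to 1 approaches hard k-means, larger m spreads membership more evenly across clusters.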
RadiaLE: A framework for designing and assessing link quality estimators in wireless sensor networks
Abstract:
Stringent cost and energy constraints impose the use of low-cost and low-power radio transceivers in large-scale wireless sensor networks (WSNs). This fact, together with the harsh characteristics of the physical environment, requires a rigorous WSN design. Mechanisms for WSN deployment and topology control, MAC and routing, and resource and mobility management greatly depend on reliable link quality estimators (LQEs). This paper describes the RadiaLE framework, which enables the experimental assessment, design and optimization of LQEs. RadiaLE comprises (i) the hardware components of the WSN testbed and (ii) a software tool for setting up and controlling the experiments, automating the gathering of link measurements through packet-statistics collection, and analyzing the collected data, allowing for LQE evaluation. We also propose a methodology that allows (i) properly setting up different types of links and different types of traffic, (ii) collecting rich link measurements, and (iii) validating LQEs using a holistic and unified approach. To demonstrate the validity and usefulness of RadiaLE, we present two case studies: the characterization of low-power links and a comparison of six representative LQEs. We also extend the second study to evaluate the accuracy of the TOSSIM 2 channel model.
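As a rough illustration of how collected packet statistics feed an LQE, the widely used Packet Reception Ratio (PRR) can be computed per window and smoothed with an exponentially weighted moving average. This sketch is generic, not RadiaLE's actual estimator, and the names and the smoothing factor are assumptions.

```python
def prr(received_flags):
    """Packet Reception Ratio: fraction of send attempts that were received."""
    return sum(received_flags) / len(received_flags)

def ewma_lqe(windows, alpha=0.6):
    """Smooth per-window PRR with an EWMA, damping short-lived link fluctuations.

    windows: list of windows, each a list of 0/1 reception flags.
    alpha:   weight kept from the previous estimate (history depth).
    """
    est = prr(windows[0])
    for w in windows[1:]:
        est = alpha * est + (1 - alpha) * prr(w)
    return est
```

Estimators compared in studies like this one differ mainly in which raw statistics they consume (reception flags, RSSI, LQI, retransmission counts) and how they filter them; the window-then-smooth structure above is the common skeleton.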
Abstract:
Reliability of communications is key to expanding application domains for sensor networks. Since Wireless Sensor Networks (WSNs) operate in the license-free Industrial, Scientific and Medical (ISM) bands and hence share the spectrum with other wireless technologies, addressing interference is an important challenge. In order to minimize its effect, nodes can dynamically adapt radio resources, provided that information about current spectrum usage is available. We present a new channel quality metric, based on the availability of the channel over time, which meaningfully quantifies spectrum usage. We discuss the optimum scanning time for capturing the channel condition while maintaining energy efficiency. Using data collected from a number of Wi-Fi networks operating in a library building, we show that our metric has a strong correlation with the Packet Reception Rate (PRR). This suggests that quantifying interference in the channel can help in adapting resources for better reliability. We present a discussion of the usage of our metric for various resource allocation and adaptation strategies.
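An availability-based channel metric of this kind can be sketched as the fraction of energy-scan samples in which the channel is idle, i.e. the measured RSSI stays below a busy threshold. The threshold value and function names here are assumptions for illustration, not the paper's definitions.

```python
def channel_availability(rssi_samples, busy_threshold_dbm=-85.0):
    """Fraction of scan samples in which the channel is idle (RSSI below threshold).

    rssi_samples: RSSI readings in dBm taken over the scanning interval.
    Returns a value in [0, 1]; higher means more usable airtime.
    """
    idle = sum(1 for s in rssi_samples if s < busy_threshold_dbm)
    return idle / len(rssi_samples)
```

A node could scan periodically, compute this ratio per channel, and prefer channels whose availability stays high, which is consistent with the abstract's observation that such a metric correlates with PRR.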
Abstract:
Preemptions account for a non-negligible overhead during system execution. There has been a substantial amount of research on estimating the delay incurred due to the loss of working sets in the processor state (caches, registers, TLBs), and some on avoiding preemptions or limiting the preemption cost. We present an algorithm to reduce preemptions by further delaying the start of execution of high-priority tasks in fixed-priority scheduling. Our approaches take advantage of the floating non-preemptive regions model and exploit the fact that, during the schedule, the relative task phasing will differ from the worst-case scenario in terms of admissible preemption deferral. Furthermore, approximations to reduce the complexity of the proposed approach are presented. A substantial set of experiments demonstrates that the approach and approximations improve over existing work, in particular for high-utilisation systems, where savings of up to 22% in the number of preemptions are attained.
Abstract:
With progressing CMOS technology miniaturization, leakage power consumption starts to dominate dynamic power consumption. Recent technology trends have equipped modern embedded processors with several sleep states and reduced the overhead (energy/time) of sleep transitions. The potential of dynamic voltage and frequency scaling (DVFS) to save energy is diminishing due to efficient (low-overhead) sleep states and increased static (leakage) power consumption. The state-of-the-art research on static power reduction at the system level is based on assumptions that cannot easily be integrated into practical systems. We propose a novel enhanced race-to-halt approach (ERTH) to reduce overall system energy consumption. Exhaustive simulations demonstrate the effectiveness of our approach, showing an improvement of up to 8% over existing work.
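The race-to-halt intuition can be illustrated with back-of-the-envelope energy arithmetic: run the workload at full speed, then drop into a low-power sleep state for the remaining slack, versus stretching execution at a lower DVFS frequency. All numbers and names below are hypothetical and purely illustrative; they are not ERTH's model.

```python
def energy_race_to_halt(cycles, f_high, p_active_high, p_sleep, deadline):
    """Energy (J) when executing at full speed, then sleeping until the deadline."""
    t_busy = cycles / f_high                       # execution time at high frequency
    return p_active_high * t_busy + p_sleep * (deadline - t_busy)

def energy_dvfs(cycles, f_low, p_active_low):
    """Energy (J) when stretching execution at a lower DVFS frequency."""
    return p_active_low * (cycles / f_low)
```

With leakage-dominated power, slowing down no longer pays off: e.g. 1e8 cycles at 1 GHz / 1 W active / 0.05 W sleep over a 1 s deadline costs 0.145 J, whereas running at 200 MHz with 0.4 W active (leakage keeps active power high) costs 0.2 J, which is the effect the abstract attributes to efficient sleep states.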