260 results for net-based pedagogy


Relevance: 30.00%

Abstract:

The increasing number of television channels, on-demand services and online content is expected to contribute to a better quality of experience for customers of such services. However, the lack of efficient methods for finding the right content, adapted to personal interests, may lead to a progressive loss of clients. In such a scenario, recommendation systems are seen as a tool that can fill this gap and contribute to user loyalty. Multimedia content, namely films and television programmes, is usually described using a set of metadata elements that include the title, a genre, the date of production, and the list of directors and actors. This paper provides an in-depth study of how the use of different metadata elements can contribute to increasing the quality of the recommendations suggested. The analysis is conducted using the Netflix and MovieLens datasets, and aspects such as the granularity of the descriptions, the accuracy metric used and the sparsity of the data are taken into account. Comparisons with collaborative approaches are also presented.
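
As a minimal sketch of the kind of metadata-based scoring studied here (the catalog, the per-field weights and the choice of Jaccard similarity are illustrative assumptions, not the paper's exact formulation):

```python
# Content-based scoring over metadata sets (genres, directors, actors).

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two metadata sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def item_similarity(m1: dict, m2: dict, weights: dict) -> float:
    """Weighted similarity across metadata elements (granularity matters)."""
    return sum(w * jaccard(set(m1[k]), set(m2[k])) for k, w in weights.items())

catalog = {
    "film_a": {"genre": ["drama"], "directors": ["d1"], "actors": ["a1", "a2"]},
    "film_b": {"genre": ["drama", "crime"], "directors": ["d1"], "actors": ["a3"]},
    "film_c": {"genre": ["comedy"], "directors": ["d2"], "actors": ["a2"]},
}
weights = {"genre": 0.5, "directors": 0.3, "actors": 0.2}  # assumed weighting

liked = ["film_a"]  # items the user rated highly
scores = {
    item: max(item_similarity(meta, catalog[l], weights) for l in liked)
    for item, meta in catalog.items() if item not in liked
}
print(sorted(scores.items(), key=lambda kv: -kv[1]))  # ranked recommendations
```

Varying which fields carry weight (e.g., genre only versus the full cast list) is one way to probe the effect of metadata granularity on recommendation quality.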

Relevance: 30.00%

Abstract:

It is unquestionable that an effective decision concerning the usage of a certain environmental clean-up technology should be conveniently supported. A significant amount of scientific work focussing on the reduction of nitrate concentration in drinking water by metallic iron and nanomaterials, and their usage in permeable reactive barriers, has been published worldwide over the last two decades. This work presents a systematic review of the most relevant research on the removal of nitrate from groundwater using nanosized iron-based permeable reactive barriers. The review covers scientific papers published between 2004 and June 2014, retrieved using 16 combinations of keywords in 34 databases, according to the PRISMA statement guidelines. Independent reviewers validated the selection criteria. Of the 4161 records filtered, 45 met the selection criteria and were included in this review. The outcomes show that permeable reactive barriers are indeed a suitable technology for denitrification, with a good performance record, but the long-term impact of using nanosized zero-valent iron in this remediation process, on both the environment and human health, is far from being adequately known. As a consequence, further work is required on this matter, so that nanosized iron-based permeable reactive barriers for the removal of nitrate from drinking water can be genuinely considered an eco-efficient technology.

Relevance: 30.00%

Abstract:

This work evaluates the feasibility of using image-based cytometry (IBC) for algal cell quantification and viability analysis, using Pseudokirchneriella subcapitata as a cell model. Cell concentration determined by IBC was linear in the range of 1 × 10⁵ to 8 × 10⁶ cells mL⁻¹. Algal viability was defined on the basis that the intact membrane of viable cells excludes the SYTOX Green (SG) probe; the disruption of membrane integrity represents irreversible damage and consequently results in cell death. Using IBC, we were able to successfully discriminate between live (SG-negative) and dead algal cells (heat-treated at 65 °C for 60 min; SG-positive). The observed viability of algal populations containing different proportions of killed cells was well correlated (R² = 0.994) with the theoretical viability. The technology was validated by exposing algal cells of P. subcapitata to a copper stress test for 96 h; IBC allowed us to follow the evolution of cell concentration and the viability of copper-exposed algal populations. This technology overcomes several main drawbacks usually associated with microscopy counting, such as labour-intensive and tedious work and the lack of representativeness of cell counts. In conclusion, IBC allowed fast, automated determination of the total number of algal cells and analysis of their viability. It can provide a useful tool for a wide variety of fields that use microalgae, such as aquatic toxicology and biotechnology.
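
A small sketch of the calibration check described above: observed viability (the fraction of SG-negative cells counted by IBC) regressed against the theoretical viability of mixtures containing known proportions of heat-killed cells. The numbers below are illustrative, not the study's data:

```python
import numpy as np

theoretical = np.array([0.0, 0.25, 0.50, 0.75, 1.00])   # fraction of live cells mixed in
observed    = np.array([0.02, 0.26, 0.49, 0.73, 0.98])  # hypothetical IBC readings

slope, intercept = np.polyfit(theoretical, observed, 1)  # linear calibration fit
pred = slope * theoretical + intercept
r2 = 1 - np.sum((observed - pred) ** 2) / np.sum((observed - np.mean(observed)) ** 2)
print(f"fit: obs = {slope:.3f}*theo + {intercept:.3f}, R^2 = {r2:.3f}")
```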

Relevance: 30.00%

Abstract:

This work explores the use of fluorescent probes to evaluate the responses of the green alga Pseudokirchneriella subcapitata to three nominal concentrations of Cd(II), Cr(VI), Cu(II) and Zn(II) over a short exposure (6 h). The toxic effect of the metals on algal cells was monitored using the fluorochromes SYTOX Green (SG, membrane integrity), fluorescein diacetate (FDA, esterase activity) and rhodamine 123 (Rh123, mitochondrial membrane potential). The impact of the metals on chlorophyll a (Chl a) autofluorescence was also evaluated. Esterase activity was the most sensitive parameter: at the concentrations studied, all metals induced its loss. SG effectively detected the loss of membrane integrity in algal cells exposed to 0.32 or 1.3 μmol L⁻¹ Cu(II). Rh123 revealed a decrease in the mitochondrial membrane potential of algal cells exposed to 0.32 and 1.3 μmol L⁻¹ Cu(II), indicating that mitochondrial activity was compromised. Chl a autofluorescence was also affected by the presence of Cr(VI) and Cu(II), suggesting perturbation of photosynthesis. In conclusion, the fluorescence-based approach was useful for detecting the disturbance of specific cellular characteristics, and fluorescent probes are a useful diagnostic tool for assessing the impact of toxicants on specific targets of P. subcapitata cells.

Relevance: 30.00%

Abstract:

This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the context of the N-M algorithm, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
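
To make the repulsion idea concrete, here is a minimal sketch under stated assumptions: the example system, the erf-based penalty shape and its radius/strength parameters are all illustrative, and the paper's actual merit function may differ in detail:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import erf

def F(x):
    """Example system: x0^2 + x1^2 - 1 = 0 and x0 - x1 = 0 (roots at ±(1/√2, 1/√2))."""
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

roots = []

def merit(x, radius=0.5, strength=10.0):
    base = float(np.sum(F(x) ** 2))          # penalty-type residual term
    # Repulsion: an erf-shaped bump raises the merit near already-found roots,
    # pushing the N-M simplex away from them.
    rep = sum(strength * (1.0 - erf(np.linalg.norm(x - r) / radius)) for r in roots)
    return base + rep

rng = np.random.default_rng(0)
for _ in range(20):                           # multistart with repulsion
    res = minimize(merit, rng.uniform(-2, 2, size=2), method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-10, "maxiter": 2000})
    if np.sum(F(res.x) ** 2) < 1e-8 and all(np.linalg.norm(res.x - r) > 1e-2 for r in roots):
        roots.append(res.x)

print([r.round(4) for r in roots])            # expect both roots of the system
```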

Relevance: 30.00%

Abstract:

Sectorization means dividing a set of basic units into sectors or parts, a procedure that occurs in several contexts, such as political, health and school districting, social networks, and sales territory or airspace assignment, in order to achieve some goal or facilitate an activity. This presentation focuses on three main issues: measures, a new approach to sectorization problems, and an application in waste collection. When designing or comparing sectors, different characteristics are usually taken into account; the most common are related to the concepts of contiguity, equilibrium and compactness. These fundamental characteristics are addressed by defining new generic measures and by proposing a new measure, desirability, connected with the idea of preference. A new approach to sectorization inspired by Coulomb's Law, which establishes a relation of force between electrically charged points, is then proposed: each charged point represents a small region with specific characteristics/values, creating pairwise relations of attraction/repulsion with the others, proportional to the charges and inversely proportional to their distance. Finally, a real case about sectorization and vehicle routing in solid waste collection is presented.
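
A minimal sketch of the pairwise relation described above, assuming a region's "charge" encodes some characteristic such as demand (the charge values and sign convention are illustrative assumptions):

```python
import numpy as np

def force(p1, q1, p2, q2):
    """Pairwise force magnitude: proportional to the product of the charges,
    inversely proportional to the distance between the points."""
    d = np.linalg.norm(p1 - p2)
    return q1 * q2 / d if d > 0 else 0.0

points  = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
charges = np.array([1.0, 2.0, -1.0])   # e.g., region "demand"; sign encodes affinity

for i in range(len(points)):
    for j in range(i + 1, len(points)):
        f = force(points[i], charges[i], points[j], charges[j])
        print(f"force({i},{j}) = {f:+.3f}")  # >0 repulsion, <0 attraction (convention)
```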

Relevance: 30.00%

Abstract:

The Internet of Things (IoT) has emerged as a paradigm over the last few years as a result of the tight integration of the computing and the physical worlds. The requirement of remote sensing makes low-power wireless sensor networks one of the key enabling technologies of IoT. These networks face several challenges, especially in communication and networking, due to their inherent constraints: low-power operation, deployment in harsh and lossy environments, and limited computing and storage resources. The IPv6 Routing Protocol for Low-Power and Lossy Networks (RPL) [1] was proposed by the IETF ROLL (Routing Over Low power and Lossy networks) working group and has been adopted as an IETF standard in RFC 6550 since March 2012. Although RPL largely satisfies the requirements of low-power and lossy networks (LLNs), several issues remain open for improvement and specification, in particular with respect to Quality of Service (QoS) guarantees and support for mobility. In this paper, we focus mainly on the RPL routing protocol and propose enhancements to the standard specification in order to provide QoS guarantees for static as well as mobile LLNs. For this purpose, we propose OF-FL (Objective Function based on Fuzzy Logic), a new objective function that overcomes the limitations of the objective functions standardized for RPL by considering important link and node metrics, namely end-to-end delay, number of hops, ETX (expected transmission count) and LQL (link quality level). In addition, we present the design of Co-RPL, an extension to RPL based on the corona mechanism that supports mobility, in order to overcome the problem of slow reactivity to frequent topology changes and thus provide a better quality of service, mainly in dynamic network applications. Performance evaluation results show that both OF-FL and Co-RPL achieve a great improvement over the standard specification, mainly in terms of packet loss ratio and average network latency.
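
A hedged sketch of the idea behind OF-FL: fold the four metrics named above into a single fuzzy "route quality" score used for parent selection. The membership shapes, metric ranges and weights below are assumptions for illustration, not the paper's actual rule base:

```python
def ramp_down(x, lo, hi):
    """Simple membership: 1 when x <= lo, 0 when x >= hi, linear in between."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def route_quality(delay_ms, hops, etx, lql):
    """Fuzzy 'route quality' in [0, 1] from link and node metrics."""
    good_delay = ramp_down(delay_ms, 10, 200)   # low end-to-end delay is good
    good_hops  = ramp_down(hops, 1, 10)         # fewer hops is good
    good_etx   = ramp_down(etx, 1.0, 5.0)       # ETX = 1 means a perfect link
    good_lql   = min(lql / 3.0, 1.0)            # assume LQL graded 0..3
    # Weighted aggregation standing in for a full fuzzy rule base (weights assumed).
    return 0.35 * good_delay + 0.15 * good_hops + 0.30 * good_etx + 0.20 * good_lql

# A node prefers the candidate parent whose route scores highest:
candidates = {"parent_a": (80, 3, 1.6, 2), "parent_b": (150, 2, 2.8, 3)}
print(max(candidates, key=lambda p: route_quality(*candidates[p])))
```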

Relevance: 30.00%

Abstract:

For efficient planning of waste collection routing, large municipalities may be partitioned into convenient sectors. The real case under consideration is the municipality of Monção, in Portugal. Waste collection involves more than 1600 containers over an area of 220 km² and a population of around 20,000 inhabitants. This is mostly a rural area where the population is distributed in small villages around the 33 borough centres (freguesias) that constitute the municipality. In most freguesias, waste is collected three times a week; in some situations, however, collection is done every day. The case reveals general and specific characteristics which are not rare but are not widely addressed in the literature. Furthermore, new methods and models to deal with sectorization and routing are introduced, which can be extended to other applications. Sectorization and routing are tackled following a three-phase approach. The first phase, the main concern of this presentation, introduces a new method for sectorization inspired by electromagnetism and Coulomb's Law; the problem involves not only territorial division but also the frequency of waste collection, a critical issue in these types of applications, and special characteristics related to the number and type of deposition points were a further motivation for this work. The second phase addresses the routing problems within each sector, for which new Mixed Capacitated Arc Routing with Limited Multi-Landfills models are presented. The last phase integrates sectorization and routing. Computational results confirm the effectiveness of the entire novel approach.
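
One hedged sketch of how collection frequency can enter the electromagnetism-inspired sectorization: let a container's "charge" grow with its weekly collection frequency, so high-frequency containers pull their sector centre more strongly. The charge formula, assignment rule and centre update below are illustrative assumptions, not the method presented in the talk:

```python
import numpy as np

containers = np.array([[0.1, 0.2], [0.9, 0.8], [0.5, 0.4], [0.8, 0.1]])
freq = np.array([3.0, 7.0, 3.0, 3.0])          # collections per week
charge = freq / freq.sum()                     # assumed: charge grows with frequency
centres = np.array([[0.2, 0.2], [0.8, 0.8]])   # candidate sector centres

def attraction(c, s):
    """Inverse-distance pull of a sector centre s on a container c."""
    d = np.linalg.norm(c - s)
    return 1.0 / d if d > 0 else np.inf

# Assign each container to the centre that attracts it most.
assignment = [int(np.argmax([attraction(c, s) for s in centres])) for c in containers]

# One weighted update: high-frequency containers pull their sector centre harder.
for k in range(len(centres)):
    members = [i for i, a in enumerate(assignment) if a == k]
    if members:
        w = charge[members]
        centres[k] = (containers[members] * w[:, None]).sum(axis=0) / w.sum()

print(assignment, centres.round(3))
```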

Relevance: 30.00%

Abstract:

Background: Musicians are frequently affected by playing-related musculoskeletal disorders (PRMD). Common solutions used by Western medicine to treat musculoskeletal pain include rehabilitation programs and drugs, but their results are sometimes disappointing. Objective: To study the effects of self-administered exercises based on Tuina techniques on the pain intensity caused by PRMD in professional orchestra musicians, using a numeric visual scale (NVS). Design, setting, participants and interventions: We performed a prospective, controlled, single-blinded, randomized study with musicians suffering from PRMD. Participating musicians were randomly distributed into the experimental (n = 39) and control (n = 30) groups. After an individual diagnostic assessment, specific Tuina self-administered exercises were developed and taught to the participants. Musicians were instructed to repeat the exercises every day for 3 weeks. Main outcome measures: Pain intensity was measured by NVS before the intervention and after 1, 3, 5, 10, 15 and 20 days of treatment. The procedure was the same for the control group, except that the Tuina exercises were executed at points away from the commonly used acupuncture points. Results: In the treatment group, but not the control group, pain intensity was significantly reduced on days 1, 3, 5, 10, 15 and 20. Conclusion: The results are consistent with the hypothesis that self-administered exercises based on Tuina techniques can help professional musicians control the pain caused by PRMD. Although these results are very promising, further studies are needed with a larger sample size and a double-blind design.

Relevance: 30.00%

Abstract:

Objective: Public health organizations recommend that preschool-aged children accumulate at least 3 h of physical activity (PA) daily. Objective monitoring using pedometers offers an opportunity to measure preschoolers' PA and assess compliance with this recommendation. The purpose of this study was to derive step-based recommendations consistent with the 3 h PA recommendation for preschool-aged children. Method: The study sample comprised 916 preschool-aged children, aged 3 to 6 years (mean age = 5.0 ± 0.8 years), recruited from kindergartens in Portugal between 2009 and 2013. Children wore an ActiGraph GT1M accelerometer that measured PA intensity and steps per day simultaneously over a 7-day monitoring period. Receiver operating characteristic (ROC) curve analysis was used to identify the daily step-count threshold associated with meeting the daily 3 h PA recommendation. Results: A significant correlation was observed between minutes of total PA and steps per day (r = 0.76, p < 0.001). The optimal step count for ≥ 3 h of total PA was 9099 steps per day (sensitivity 90%, specificity 66%), with an area under the ROC curve of 0.86 (95% CI: 0.84 to 0.88). Conclusion: Preschool-aged children who accumulate fewer than 9000 steps per day may be considered insufficiently active.
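
A sketch of the ROC step described above: find the daily step-count cut-point that best separates children meeting the 3 h PA recommendation from those who do not. The data are simulated, and the Youden index is one common criterion for picking the optimum (the paper's exact criterion is not stated in the abstract):

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(42)
meets_rec = rng.integers(0, 2, size=500)                  # 1 = meets 3 h PA/day
steps = np.where(meets_rec == 1,
                 rng.normal(11000, 2000, 500),
                 rng.normal(7500, 2000, 500))             # simulated steps/day

fpr, tpr, thresholds = roc_curve(meets_rec, steps)
best = np.argmax(tpr - fpr)                               # Youden's J statistic
print(f"optimal cut-point ≈ {thresholds[best]:.0f} steps/day, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```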

Relevance: 30.00%

Abstract:

Optimization methods have been used in many areas of knowledge, such as engineering, statistics and chemistry, to solve optimization problems. In many cases it is not possible to use derivative-based methods, due to the characteristics of the problem and/or its constraints, for example when the functions involved are non-smooth and/or their derivatives are not known. To solve this type of problem, a Java-based API has been implemented that includes only derivative-free optimization methods and can be used to solve both constrained and unconstrained problems. For solving constrained problems, the classic Penalty and Barrier functions were included in the API. In this paper a new approach to Penalty and Barrier functions, based on Fuzzy Logic, is proposed. Two penalty functions that impose a progressive penalization on solutions violating the constraints are discussed: the implemented functions impose a light penalty when the violation of the constraints is low and a heavy penalty when the violation is high. Numerical results obtained on twenty-eight test problems, comparing the proposed Fuzzy Logic based functions with six of the classic Penalty and Barrier functions, are presented. Considering the results achieved, it can be concluded that the proposed penalty functions, besides being very robust, also perform very well.
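
A minimal sketch of a progressive, fuzzy-flavoured penalty for inequality constraints g(x) ≤ 0: the violation's membership degree in "infeasible" rises smoothly, so small violations are penalized lightly and large ones heavily. The membership shape, thresholds and weight are illustrative assumptions, not the API's actual functions (which are implemented in Java):

```python
def infeasibility_degree(v, soft=0.1, hard=1.0):
    """Fuzzy membership of violation v in 'infeasible': 0 below soft, 1 above hard."""
    if v <= soft:
        return 0.0
    if v >= hard:
        return 1.0
    return (v - soft) / (hard - soft)

def penalized(f, gs, x, weight=100.0):
    """Objective plus progressive penalties over inequality constraints g(x) <= 0."""
    violations = [max(0.0, g(x)) for g in gs]
    return f(x) + weight * sum(infeasibility_degree(v) * v for v in violations)

# Example: minimize f(x) = x^2 subject to x >= 1 (i.e., g(x) = 1 - x <= 0).
f = lambda x: x**2
g = lambda x: 1 - x
for x in (0.5, 0.95, 1.5):
    print(x, penalized(f, [g], x))   # mild violation barely penalized, large one heavily
```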

Relevance: 30.00%

Abstract:

The complexity of systems is considered an obstacle to the progress of the IT industry. Autonomic computing is presented as an alternative to cope with this growing complexity. It is a holistic approach in which systems are able to configure, heal, optimize and protect themselves. Web-based applications are an example of systems where the complexity is high: the number of components, their interoperability and workload variations are factors that may lead to performance failures or unavailability scenarios, whose occurrence affects the revenue and reputation of businesses that rely on these types of applications. In this article, we present a self-healing framework for Web-based applications (SHõWA). SHõWA is composed of several modules, which monitor the application, analyze the data to detect and pinpoint anomalies, and execute recovery actions autonomously. The monitoring is done by a small aspect-oriented programming agent. This agent does not require changes to the application source code and includes adaptive and selective algorithms to regulate the level of monitoring. The anomalies are detected and pinpointed by means of statistical correlation: the data analysis detects changes in the server response time and determines whether those changes are correlated with the workload or are due to a performance anomaly. In the presence of performance anomalies, the data analysis pinpoints the anomaly, and SHõWA then executes a recovery procedure. We also present a study of the detection and localization of anomalies, the accuracy of the data analysis, and the performance impact induced by SHõWA. Two benchmarking applications, exercised through dynamic workloads, and different types of anomaly were considered in the study. The results reveal that (1) SHõWA detects and pinpoints anomalies while the number of affected end users is still low; (2) SHõWA detects anomalies without raising any false alarms; and (3) SHõWA does not induce a significant performance overhead (throughput was affected by less than 1%, and the response time delay was no more than 2 milliseconds).
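
A hedged sketch of the correlation idea behind the data analysis: if response time rises but stays correlated with workload, the slowdown is load-driven; if it rises and decorrelates from workload, flag a performance anomaly. The window size, thresholds and use of Spearman correlation are assumptions for illustration, not SHõWA's exact analysis:

```python
import numpy as np
from scipy.stats import spearmanr

def analyze(workload, resp_time, slow_ms=500.0, corr_min=0.7):
    """Return 'ok', 'load' or 'anomaly' for the most recent monitoring window."""
    if np.mean(resp_time) < slow_ms:
        return "ok"
    rho, _ = spearmanr(workload, resp_time)
    return "load" if rho >= corr_min else "anomaly"

rng = np.random.default_rng(1)
load = rng.uniform(50, 200, 60)                             # requests/s in the window
print(analyze(load, 5.0 * load + rng.normal(0, 10, 60)))    # slow but load-driven
print(analyze(load, rng.normal(800, 50, 60)))               # slow and decorrelated
```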

Relevance: 30.00%

Abstract:

In this paper, a linguistically rule-based grapheme-to-phone (G2P) transcription algorithm is described for European Portuguese. A complete set of phonological and phonetic transcription rules for the standard variety of European Portuguese is presented. The algorithm was implemented and tested using online newspaper articles, and the experimental results yielded an accuracy rate of 98.80%. Future developments to increase this value are foreseen. Our purpose with this work is to develop a module/tool that can improve the naturalness of synthetic speech in European Portuguese. Other applications of this system, such as language teaching/learning, can also be expected. These results, together with our perspectives for future improvements, confirm the great importance of linguistic knowledge in the development of Text-to-Speech (TTS) systems.
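
A toy illustration of ordered, linguistically motivated rewrite rules for G2P conversion. The rules below are a handful of simplified European Portuguese correspondences (e.g., "nh" → /ɲ/, "lh" → /ʎ/, "ch" → /ʃ/) and are far from the paper's complete rule set:

```python
RULES = [          # applied longest-match first, left to right
    ("nh", "ɲ"),
    ("lh", "ʎ"),
    ("ch", "ʃ"),
    ("ss", "s"),
    ("ão", "ɐ̃w̃"),
]

def g2p(word: str) -> str:
    out, i = [], 0
    while i < len(word):
        for graph, phone in RULES:
            if word.startswith(graph, i):   # first matching rule wins
                out.append(phone)
                i += len(graph)
                break
        else:                               # no rule matched: pass the grapheme through
            out.append(word[i])
            i += 1
    return "".join(out)

print(g2p("pinhão"))  # -> piɲɐ̃w̃ (toy output)
```

Rule ordering matters in such systems: context-sensitive rules must fire before the generic fall-through, which is why the list is scanned in a fixed order.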

Relevance: 30.00%

Abstract:

Recent developments in Hidden Markov Model (HMM) based speech synthesis have shown that it is a promising technology, fully capable of competing with other established techniques. However, some issues still lack a solution: several authors report an over-smoothing phenomenon in both time and frequency that decreases naturalness and sometimes intelligibility. In this work we present a new vowel intelligibility enhancement algorithm that uses a discrete Kalman filter (DKF) to track frame-based parameters. The inter-frame correlations are modelled by an autoregressive structure, which provides an underlying time-frame dependency and can improve time-frequency resolution. The system's performance has been evaluated using objective and subjective tests, and the proposed methodology has led to improved results.
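
A minimal sketch of the tracking idea: a discrete Kalman filter whose state follows an AR(1) model, smoothing a noisy frame-based parameter track while preserving inter-frame correlation. The AR coefficient, noise levels and test signal are assumptions for illustration, not the paper's configuration:

```python
import numpy as np

def kalman_ar1(z, a=0.95, q=0.01, r=0.1):
    """Filter observations z with state model x_t = a*x_{t-1} + w, w ~ N(0, q)."""
    x, p = z[0], 1.0                               # initial state estimate and variance
    out = [x]
    for zt in z[1:]:
        x_pred, p_pred = a * x, a * a * p + q      # predict via the AR(1) structure
        k = p_pred / (p_pred + r)                  # Kalman gain
        x = x_pred + k * (zt - x_pred)             # update with the new frame
        p = (1 - k) * p_pred
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(7)
true_track = np.sin(np.linspace(0, 3, 100))        # slowly varying frame parameter
noisy = true_track + rng.normal(0, 0.3, 100)       # noisy per-frame estimates
smoothed = kalman_ar1(noisy)
# The filtered track should be closer to the true one than the raw estimates:
print(np.mean((smoothed - true_track) ** 2) < np.mean((noisy - true_track) ** 2))
```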

Relevance: 30.00%

Abstract:

In the last few years the number of systems and devices that use voice-based interaction has grown significantly. For continued use of these systems, the interface must be reliable and pleasant in order to provide an optimal user experience. However, there are currently very few studies that evaluate how good a voice is when the application is a speech-based interface. In this paper we present a new automatic voice pleasantness classification system based on prosodic and acoustic patterns of voice preference. Our study is based on a multi-language database composed of female voices. In the objective performance evaluation, the system achieved a 7.3% error rate.
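
A hedged sketch of the classification step: map prosodic/acoustic features extracted from a voice to a pleasantness label learned from listener preferences. The feature set (mean F0, F0 range, speech rate, jitter, HNR), the simulated data and the classifier choice are illustrative assumptions, not the paper's system:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# Hypothetical per-voice features: mean F0 (Hz), F0 range, speech rate, jitter, HNR.
X = rng.normal([200, 60, 5.0, 1.0, 15.0], [25, 15, 0.7, 0.3, 3.0], size=(120, 5))
# Simulated preference label loosely tied to HNR and F0 range.
y = (X[:, 4] + 0.02 * X[:, 1] + rng.normal(0, 1.5, 120) > 16).astype(int)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)          # cross-validated accuracy
print(f"cross-validated error rate ≈ {1 - scores.mean():.1%}")
```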