940 results for Distance-based techniques


Relevance: 30.00%

Abstract:

We propose an analytical method based on Fourier transform infrared attenuated total reflectance (FTIR-ATR) spectroscopy to detect the adulteration of petrodiesel and petrodiesel/palm biodiesel blends with African crude palm oil. The infrared spectral fingerprints from the sample analysis were used to perform principal component analysis (PCA) and to construct a prediction model using partial least squares (PLS) regression. The PCA results separated the samples into three groups, allowing identification of those adulterated with palm oil. The resulting model shows good predictive capacity for determining the concentration of palm oil in petrodiesel/biodiesel blends. Advantages of the proposed method include cost-effectiveness, speed, and environmental friendliness.
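
As an illustration of the chemometric pipeline this abstract describes (PCA for exploratory grouping, PLS regression for calibration), the following is a minimal sketch assuming scikit-learn and synthetic stand-in spectra; the study's own data handling and software are not specified:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 60, 400
X = rng.normal(size=(n_samples, n_wavenumbers))   # stand-in FTIR-ATR spectra
y = rng.uniform(0, 10, size=n_samples)            # stand-in palm-oil content (%)

# Exploratory step: project the spectra onto the first two principal
# components to look for groupings (e.g., pure vs. adulterated blends).
scores = PCA(n_components=2).fit_transform(X)

# Calibration step: PLS regression relating spectra to adulterant level.
pls = PLSRegression(n_components=5).fit(X, y)
y_pred = pls.predict(X).ravel()
print(scores[:3], y_pred[:3])
```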

Relevance: 30.00%

Abstract:

A novel superabsorbent hydrogel (SH) composite based on a poly(acrylamide-co-acrylate) matrix filled with nontronite (NONT), an Fe(III)-rich member of the smectite group of clay minerals, is described. A variety of techniques, including FTIR, XRD, TGA, and SEM/EDX, were used to characterize this original composite. Experimental data confirmed the formation of the SH composite and suggested that NONT was completely dispersed in the polymeric matrix. Additionally, NONT improved the water-uptake capacity of the final material, which exhibited fast absorption, low sensitivity to the presence of salt, high water retention, and pH-sensitive properties. These preliminary data show that the SH composite prepared here possesses highly attractive properties for applications in areas such as agriculture, particularly as a soil conditioner.

Relevance: 30.00%

Abstract:

A novel Fe3+-selective, turn-on fluorescent probe (1) incorporating a rhodamine fluorophore and a quinoline subunit was synthesized. Probe 1 displayed high selectivity for Fe3+ in CH3CN–H2O (95:5, v/v) in the presence of other relevant metal cations. Interaction with Fe3+ in 1:1 stoichiometry triggered a significant fluorescence enhancement due to formation of the ring-opened form. The fluorescence response images were analyzed by a novel Euclidean distance method based on red, green, and blue (RGB) values. A linear relationship was observed between the fluorescence intensity changes and Fe3+ concentrations from 7.3 × 10−7 to 3.6 × 10−5 mol L−1.
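
The RGB-based Euclidean distance readout can be sketched in a few lines; this is a generic illustration (not the authors' implementation), where the blank reference values are hypothetical:

```python
import numpy as np

def rgb_distance(rgb_sample, rgb_blank):
    """Euclidean distance between two colors in RGB space.

    A larger distance from the blank (probe without Fe3+) indicates a
    stronger fluorescence response in the captured image.
    """
    d = np.asarray(rgb_sample, float) - np.asarray(rgb_blank, float)
    return float(np.sqrt((d ** 2).sum()))

# Example: mean RGB of an image region with Fe3+ vs. the probe-only blank.
print(rgb_distance((180, 40, 60), (30, 30, 30)))
```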

Relevance: 30.00%

Abstract:

OBJECTIVE: To describe the first experience with an Internet-based course for ophthalmology residents. METHOD: Twenty-three residents were invited to participate in the study; however, only 13 (56.52%) took part, performing the proposed activities and answering a questionnaire. RESULTS: Of the 13 participants, only five (38.46%) completed 100% of the tasks, three (23.07%) completed between 70 and 90%, two (15.38%) completed between 50 and 60%, and three (23.07%) completed less than 10% of the tasks. Regarding the use of computers and the Internet in general, all participants reported using the Internet daily, and all affirmed that they use the Internet to study or to conduct research. CONCLUSION: Despite the advantages of the Internet, medical residents remain very reluctant to use it. In the context of information and communication technologies, there is a pressing need to reformulate continuing medical education to meet the demands of this new, developing world.

Relevance: 30.00%

Abstract:

INTRODUCTION: Web-based e-learning is a teaching tool increasingly used in many medical schools and specialist fields, including ophthalmology. AIMS: This pilot study aimed to develop Internet-based clinical cases for a course and to evaluate the effectiveness of this method within a graduate medical education group. METHODS: This was an interventional randomized study. First, a website was built using a distance-learning platform. Sixteen first-year ophthalmology residents were then divided into two randomized groups: an experimental group, which received the intervention (use of the e-learning site), and a control group, which did not. The students answered a printed clinical case, and their scores were compared. RESULTS: There was no statistically significant difference between the groups. CONCLUSION: We were able to successfully develop the e-learning site and the respective clinical cases. Although there was no statistically significant difference between the access and non-access groups, the study was a pioneer in our department, since an online clinical case program had never previously been developed.

Relevance: 30.00%

Abstract:

Visual data mining (VDM) tools employ information visualization techniques to represent large amounts of high-dimensional data graphically and to involve the user in exploring the data at different levels of detail. Users look for outliers, patterns and models – in the form of clusters, classes, trends, and relationships – in different categories of data, e.g., financial or business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first problem is the evaluation of projection-based visualizations with respect to their effectiveness in preserving the original distances between data points and the clustering structure of the data. Here, we propose the use of existing clustering validity measures and illustrate their usefulness in evaluating five visualization techniques: Principal Components Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem concerns evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and conduct an evaluation of nine visualization techniques: Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM; in this evaluation, we use an inquiry technique with a questionnaire developed on the basis of the proposed framework. The contributions of the thesis consist of three new evaluation techniques and the results obtained by applying them. The thesis provides a systematic approach to the evaluation of various visualization techniques: first, we performed and described the evaluations in a systematic way, highlighting the evaluation activities and their inputs and outputs; second, we integrated the evaluation studies into the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems select appropriate visualization techniques in specific situations. They also contribute to understanding the strengths and limitations of the evaluated visualization techniques and, further, to the improvement of these techniques.
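
One way to operationalize the first evaluation idea, scoring a projection by how well it preserves clustering structure, is to compute a clustering validity index in the original space and in the 2-D projection and compare the two. A minimal sketch with the silhouette coefficient and scikit-learn follows; the thesis's specific validity measures may differ:

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.metrics import silhouette_score

# Labeled high-dimensional data with a known cluster structure.
X, labels = make_blobs(n_samples=300, n_features=10, centers=4, random_state=1)

# Validity index in the original space vs. in the 2-D PCA projection.
s_original = silhouette_score(X, labels)
s_projected = silhouette_score(PCA(n_components=2).fit_transform(X), labels)

# The closer the projected score is to the original one, the better the
# projection preserves the clustering structure of the data.
print(f"silhouette original={s_original:.3f}, projected={s_projected:.3f}")
```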

Relevance: 30.00%

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages these methods share: the approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have received the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study, and demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternatives. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
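
The leave-pair-out idea can be stated compactly: for every positive–negative pair, train on all remaining examples and check whether the held-out positive is scored above the held-out negative; the fraction of correctly ordered pairs estimates the AUC. Below is a brute-force sketch with a ridge regression scorer, assuming scikit-learn; the thesis develops far more efficient matrix-algebra shortcuts for this computation:

```python
import numpy as np
from itertools import product
from sklearn.linear_model import Ridge

def leave_pair_out_auc(X, y, alpha=1.0):
    """Brute-force leave-pair-out AUC estimate for binary labels y in {0, 1}."""
    pos = np.flatnonzero(y == 1)
    neg = np.flatnonzero(y == 0)
    correct = ties = 0.0
    for i, j in product(pos, neg):
        mask = np.ones(len(y), dtype=bool)
        mask[[i, j]] = False                       # hold out one pos/neg pair
        model = Ridge(alpha=alpha).fit(X[mask], y[mask])
        si, sj = model.predict(X[[i, j]])
        correct += si > sj                         # positive ranked above negative
        ties += si == sj
    return (correct + 0.5 * ties) / (len(pos) * len(neg))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = (X[:, 0] + 0.5 * rng.normal(size=40) > 0).astype(int)
print(leave_pair_out_auc(X, y))
```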

Relevance: 30.00%

Abstract:

Resonance energy transfer (RET) is a non-radiative transfer of excitation energy from an initially excited luminescent donor to an acceptor. The requirements for resonance energy transfer are: i) spectral overlap between the donor emission spectrum and the acceptor absorption spectrum, ii) close proximity of the donor and the acceptor, and iii) suitable relative orientations of the donor emission and acceptor absorption transition dipoles. As a result of the RET process, the donor luminescence intensity and the donor lifetime are decreased; if the acceptor is luminescent, a sensitized acceptor emission appears. The rate of RET depends strongly on the donor–acceptor distance (r), being inversely proportional to r⁶. This distance dependence is utilized in binding assays: the proximity requirement and the selective detection of the RET-modified emission signal allow homogeneous, separation-free assays. The term lanthanide-based RET is used when luminescent lanthanide compounds serve as donors. The long luminescence lifetimes, large Stokes' shifts and intense, sharply spiked emission spectra of lanthanide donors offer advantages over conventional organic donor molecules. Both organic lanthanide chelates and inorganic up-converting phosphor (UCP) particles have been used as donor labels in RET-based binding assays. In the present work, lanthanide luminescence and lanthanide-based resonance energy transfer phenomena were studied. Luminescence lifetime measurements had an essential role in the research. Modular frequency-domain and time-domain luminometers were assembled and used successfully in the lifetime measurements. The frequency-domain luminometer operated in the low-frequency domain (below 100 kHz) and utilized a novel dual-phase lock-in detection of the luminescence. One of the studied phenomena was the recently discovered non-overlapping fluorescence resonance energy transfer (nFRET), for which the distance and temperature dependences were investigated. The distance dependence was found to deviate from the Förster theory, and a clear temperature dependence was observed, whereas conventional RET was completely independent of temperature. Based on the experimental results, two thermally activated mechanisms were proposed for the nFRET process. The work with the UCP particles involved measuring the luminescence properties of UCP particles synthesized in our laboratory, with the goal of developing UCP donor labels for binding assays. The effects of the dopant concentrations and the core–shell structure on the total up-conversion luminescence intensity, the red–green emission ratio, and the luminescence lifetime were studied. The non-radiative nature of the energy transfer from UCP particle donors to organic acceptors was also demonstrated for the first time in an aqueous environment and with a controlled donor–acceptor distance.
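
The r⁶ dependence mentioned above is the classical Förster relation; in standard textbook notation (not specific to this thesis), the transfer rate and efficiency are:

```latex
% Förster rate of RET at donor–acceptor distance r, where \tau_D is the
% donor lifetime in the absence of the acceptor and R_0 is the Förster
% radius (the distance at which the transfer efficiency E equals 50%).
k_{\mathrm{RET}}(r) = \frac{1}{\tau_D}\left(\frac{R_0}{r}\right)^{6},
\qquad
E(r) = \frac{R_0^{6}}{R_0^{6} + r^{6}}
```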

Relevance: 30.00%

Abstract:

Information about the transport and dispersion capacity of soluble pollutants in natural streams is important for the management of water resources, especially in planning preventive measures to minimize the problems caused by accidental or intentional releases, with impacts on public health and on economic activities that depend on the use of water. Given this importance, this study aimed to develop a warning system for rivers, based on experimental tracer techniques and on analytical equations for the one-dimensional transport of conservative soluble pollutants, to support decision-making in the management of water resources. The system, developed in the Java programming language with a MySQL database, can predict the travel time of pollutant clouds from a release point and graphically displays the temporal distribution of concentrations during the passage of the cloud at a particular location downstream of the release point.
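
The abstract does not reproduce the analytical equations, but the classical one-dimensional solution for an instantaneous release of a conservative pollutant is the Gaussian cloud below; this sketch (in Python rather than the system's Java, with illustrative parameter values) evaluates the concentration over time at a downstream station:

```python
import numpy as np

def concentration(x, t, mass, area, velocity, dispersion):
    """Classical 1-D advection-dispersion solution for an instantaneous
    release of a conservative pollutant (Gaussian cloud).

    x: distance downstream of the release point (m)
    t: time since release (s), t > 0
    mass: released mass (kg); area: stream cross-section (m^2)
    velocity: mean flow velocity (m/s)
    dispersion: longitudinal dispersion coefficient (m^2/s)
    """
    t = np.asarray(t, dtype=float)
    spread = np.sqrt(4.0 * np.pi * dispersion * t)
    return (mass / (area * spread)) * np.exp(
        -((x - velocity * t) ** 2) / (4.0 * dispersion * t)
    )

# Concentration time series 2 km downstream of a 10 kg release.
t = np.linspace(60, 6 * 3600, 200)
c = concentration(x=2000.0, t=t, mass=10.0, area=15.0,
                  velocity=0.4, dispersion=25.0)
print(f"peak {c.max():.4f} kg/m^3 at t = {t[c.argmax()] / 60:.0f} min")
```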

Relevance: 30.00%

Abstract:

Soybean is important to the economy of Brazil, so estimating the planted area and production with greater antecedence and reliability is essential. Remote sensing techniques can help obtain this information at lower cost and with less subjectivity than traditional surveys. The aim of this study was to estimate the area planted with soybean in the 2008/2009 crop season in municipalities in the west of the state of Paraná, Brazil, based on the spectral dynamics of the crop and using a specific analysis system for images from the Landsat 5/TM satellite. The results were satisfactory: supervised classification by maximum likelihood (MaxVer), together with the techniques of the specific image-analysis system, allowed an estimate of the soybean planted area (soybean mask), yielding a Global Accuracy averaging 79.05% and a Kappa Index above 63.50% in all municipalities. Monitoring a reference area was of great importance for determining the vegetative phase in which the crop is most distinct from other targets, facilitating the choice of training samples (ROIs) and avoiding misclassifications.
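
For reference, both map-accuracy metrics reported above can be computed from a classification confusion matrix; the following generic sketch (not the study's code, with an invented example matrix) shows the calculation:

```python
import numpy as np

def accuracy_metrics(confusion):
    """Overall (global) accuracy and Cohen's kappa from a confusion matrix
    whose rows are classified classes and columns are reference classes."""
    cm = np.asarray(confusion, dtype=float)
    total = cm.sum()
    observed = np.trace(cm) / total                        # global accuracy
    expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total ** 2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa

# Example 2x2 matrix: soybean vs. non-soybean pixels.
oa, k = accuracy_metrics([[120, 20],
                          [15, 145]])
print(f"global accuracy={oa:.2%}, kappa={k:.3f}")
```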

Relevance: 30.00%

Abstract:

The aim of this study was to group, by similarity, the temporal profiles of the 10-day composite NDVI product obtained by the SPOT Vegetation sensor for municipalities with high soybean production in the state of Paraná, Brazil, in the 2005/2006 cropping season. Data mining is a valuable tool for extracting knowledge from a database, identifying valid, new, potentially useful and understandable patterns. We therefore generated clusters using the K-Means, MAXVER and DBSCAN algorithms, implemented in the WEKA software package. Clusters were created based on the average temporal NDVI profiles of the 277 municipalities with high soybean production in the state. The best results were obtained with the K-Means algorithm, grouping the municipalities into six clusters over the period from the beginning of October until the end of March, which corresponds to the crop's vegetative cycle. Half of the generated clusters presented a spectro-temporal pattern characteristic of soybean and lay mostly within the soybean belt of the state of Paraná, showing that the proposed methodology gives good results for identifying homogeneous areas. These results will be useful for creating regional soybean "masks" to estimate the planted area of this crop.
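
A minimal sketch of the clustering step, using scikit-learn's K-Means in place of the WEKA implementation and synthetic stand-in NDVI profiles, could look as follows:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# Stand-in data: one average NDVI profile per municipality, one value per
# 10-day composite from early October to late March (about 18 composites).
n_municipalities, n_composites = 277, 18
t = np.linspace(0.0, 1.0, n_composites)
base = 0.2 + 0.6 * np.exp(-((t - 0.55) ** 2) / 0.03)  # crop-like greening peak
profiles = base + 0.05 * rng.normal(size=(n_municipalities, n_composites))

km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(profiles)
# km.labels_ assigns each municipality to a cluster; clusters whose centroid
# shows the crop-like greening/senescence shape are candidate soybean masks.
for c, center in enumerate(km.cluster_centers_):
    print(c, np.round(center[:5], 2))
```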

Relevance: 30.00%

Abstract:

In the network era, creative achievements such as innovations are increasingly created through interaction among different actors. The complexity of today's problems transcends the individual human mind, requiring not only individual but also collective creativity. In collective creativity, it is impossible to trace the source of new ideas to an individual; instead, creative activity emerges from the collaboration and contribution of many individuals, blurring the contribution of specific individuals in creating ideas. Collective creativity is often associated with diversity of knowledge, skills, experiences and perspectives. Collaboration between diverse actors thus triggers creativity and creates possibilities for collective creativity. This dissertation investigates collective creativity in the context of practice-based innovation. Practice-based innovation processes are triggered by problem setting in a practical context and conducted in non-linear processes utilising scientific and practical knowledge production and creation in cross-disciplinary innovation networks. In these networks, diversity, or distance between innovation actors, is essential: innovation potential may be found in exploiting different kinds of distances. This dissertation presents different kinds of distances, such as cognitive, functional and organisational, which can be considered sources of creativity and thus of innovation. However, the formation and functioning of such innovation networks can be problematic; the distances between innovating actors may be so great that a special interpretation function is needed – that is, brokerage. This dissertation defines factors that enhance collective creativity in practice-based innovation, especially in the fuzzy front end of innovation processes. The first objective of this dissertation is to study individual and collective creativity at the employee level and to identify the factors that support individual and collective creativity in the organisation. The second objective is to study how organisations use external knowledge to support collective creativity in their innovation processes in open, multi-actor innovation. The third objective is to define how brokerage functions create possibilities for collective creativity, especially in the context of practice-based innovation. The research objectives have been studied through five substudies using a case-study strategy, each highlighting various aspects of creativity and collective creativity. The empirical data consist of materials from innovation projects arranged in the Lahti region, Finland, or from the development of innovation methods in the region, which was chosen as the research context because its innovation policy especially emphasises the promotion of practice-based innovations. The results of this dissertation indicate that not all possibilities of collective creativity are utilised in the internal operations of organisations. The dissertation introduces several factors that could support collective creativity in organisations. However, creativity as a social construct is understood and experienced differently in different organisations, and these differences should be taken into account when supporting creativity. The increasing complexity of most potential innovations requires collaborative creative efforts that often exceed the boundaries of the organisation and call for the involvement of external expertise.
In practice-based innovation, different distances are considered sources of creativity. This dissertation gives practical guidance on how different kinds of distances can be exploited deliberately. It especially underlines the importance of brokerage functions in open, practice-based innovation for creating possibilities for collective creativity. As a contribution of this dissertation, a model of brokerage functions in practice-based innovation is formulated. According to the model, the results and success of brokerage functions are based on the context of brokerage as well as on the roles, tasks, skills and capabilities of brokers. Brokerage functions in practice-based innovation can also be divided into social and cognitive brokerage.

Relevance: 30.00%

Abstract:

Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineer. Different kinds of informal and formal software process modeling languages are essential tools for process engineers; they are used to define processes in a way that allows easy management of processes, for example process dissemination, tailoring and enactment. Process modeling languages are usually used as tools for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles representing the dissertation research done on process modeling over an approximately five-year period. The research follows the classical engineering research discipline, in which the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed. The research applies a variety of research techniques, ranging from literature surveys to qualitative studies done among software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, such as lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique that software development teams can use to quickly analyze their work practices in a more objective manner. It also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way; models also help to share this knowledge with others. A qualitative study done among Finnish software practitioners verifies the conclusions of the other studies in the dissertation: although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work.
However, the potential of these techniques intrigues practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies in which the modeling techniques are used, for example, to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.

Relevance: 30.00%

Abstract:

Over the past decade, organizations worldwide have begun to widely adopt agile software development practices, which offer greater flexibility in the face of frequently changing business requirements, better cost effectiveness due to minimization of waste, faster time-to-market, and closer collaboration between business and IT. At the same time, IT services continue to be increasingly outsourced to third parties, giving organizations the ability to focus on their core capabilities as well as to take advantage of better demand scalability, access to specialized skills, and cost benefits. An output-based pricing model, in which customers pay directly for the functionality that was delivered rather than for the effort spent, is quickly becoming a new trend in IT outsourcing, allowing the risk to be transferred away from the customer while offering much better incentives for the supplier to optimize processes and improve efficiency, consequently producing a true win-win outcome. Despite the widespread adoption of both agile practices and output-based outsourcing, there is little formal research available on how the two can be effectively combined in practice. Moreover, little practical guidance exists on how companies can measure the performance of their agile projects that are delivered in an output-based outsourced environment. This research attempted to shed light on this issue by developing a practical project monitoring framework which may be readily applied by organizations to monitor the performance of agile projects in an output-based outsourcing context, thus taking advantage of the combined benefits of such an arrangement. Adapted from the action research approach, this research was divided into two cycles, each consisting of Identification, Analysis, Verification, and Conclusion phases. During Cycle 1, a list of six Key Performance Indicators (KPIs) was proposed and accepted by the professionals in the studied multinational organization; this list formed the core of the proposed framework and answered the first research sub-question of what needs to be measured. In Cycle 2, a more in-depth analysis was provided for each of the suggested KPIs, including the techniques for capturing, calculating, and evaluating the information provided by each KPI. In the course of Cycle 2, the second research sub-question was answered, clarifying how the data for each KPI needed to be measured, interpreted, and acted upon. Consequently, after two incremental research cycles, the primary research question was answered, describing the practical framework that may be used for monitoring the performance of agile IT projects delivered in an output-based outsourcing context. This framework was evaluated by professionals within the context of the studied organization and received positive feedback across all four evaluation criteria set forth in this research: low overhead of data collection, high value of the provided information, understandability of the metric dashboard, and high generalizability of the proposed framework.

Relevance: 30.00%

Abstract:

Multiprocessing is a promising solution to meet the requirements of near-future applications. To get full benefit from parallel processing, a many-core system needs an efficient on-chip communication architecture. Network-on-Chip (NoC) is a general-purpose communication concept that offers high throughput and reduced power consumption, and keeps complexity in check through a regular composition of basic building blocks. This thesis presents power-efficient communication approaches for networked many-core systems. We address a range of issues important for designing power-efficient many-core systems at two levels: the network level and the router level. From the network-level point of view, exploiting state-of-the-art concepts such as Globally Asynchronous Locally Synchronous (GALS), Voltage/Frequency Island (VFI), and 3D Network-on-Chip approaches may be a solution to the excessive power consumption of today's and future many-core systems. To this end, a low-cost 3D NoC architecture, based on high-speed GALS-based vertical channels, is proposed to mitigate the high peak temperatures, power densities, and area footprints of vertical interconnects in 3D ICs. To further exploit the negligible inter-layer distances of 3D ICs, we propose a novel hybridization scheme for inter-layer communication. In addition, an efficient adaptive routing algorithm is presented which enables congestion-aware and reliable communication for the hybridized NoC architecture. An integrated monitoring and management platform on top of this architecture is also developed in order to implement more scalable power optimization techniques. From the router-level perspective, four design styles for implementing power-efficient reconfigurable interfaces in VFI-based NoC systems are proposed. To enhance the utilization of virtual channel buffers and to manage their power consumption, a partial virtual channel sharing method for NoC routers is devised and implemented. Extensive experiments with synthetic and real benchmarks show significant power savings and mitigated hotspots with performance similar to the latest NoC architectures. The thesis concludes that carefully co-designed elements from different network levels enable considerable power savings for many-core systems.
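
The congestion-aware adaptive routing idea can be illustrated with a toy selection function for a 2-D mesh that, among the minimal output ports, prefers the neighbor reporting the most free buffer slots; this is a generic sketch, not the hybrid routing algorithm of the thesis:

```python
def select_output_port(current, destination, free_slots):
    """Toy congestion-aware port selection for a 2-D mesh NoC.

    Among the minimal (distance-reducing) output ports, pick the one whose
    downstream buffer reports the most free slots. A generic illustration
    of adaptive routing, not the scheme proposed in the thesis.
    current, destination: (x, y) tiles; free_slots: dict port -> free buffers.
    """
    (cx, cy), (dx, dy) = current, destination
    candidates = []
    if dx > cx: candidates.append("east")
    if dx < cx: candidates.append("west")
    if dy > cy: candidates.append("north")
    if dy < cy: candidates.append("south")
    if not candidates:
        return "local"                  # packet has reached its tile
    return max(candidates, key=lambda p: free_slots[p])

# Two minimal ports (east, north); the less congested one is chosen.
print(select_output_port((1, 1), (3, 2),
                         {"east": 2, "west": 4, "north": 5, "south": 1}))
```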