901 results for Techniques of data analysis


Relevance: 100.00%

Abstract:

The study examined how scenario analysis can be used in the study of new technology. It was found that the suitability of scenario analysis is most affected by the level of technological change and by the nature of the available information. The scenario method is well suited to the study of new technologies, especially in the case of radical innovations. The reason for this is the high uncertainty, the complexity, and the shift in the prevailing paradigm associated with such innovations, which render many other futures research methods unusable in this situation. In the empirical part of the work, the future of grid computing technology was studied with scenario analysis. Grid computing was seen as a potentially disruptive technology that, as a radical innovation, may shift computing from the current product-based purchasing of computing capacity to a service-based model. This would have a major impact on the entire current ICT industry, particularly through the use of on-demand computing. The study examined developments up to the year 2010. Based on theory and existing knowledge, and drawing strongly on expert knowledge, four possible environmental scenarios for grid computing were constructed. The scenarios showed that the commercial success of the technology still lies behind many challenges. In particular, trust and the creation of added value emerged as the most important factors driving the future of grid computing.

Relevance: 100.00%

Abstract:

The introduction of time-series graphs into British economics in the 19th century depended on the "timing" of history. This involved reconceptualizing history into events that were comparable, measurable, and standardized by time unit. Yet classical economists in Britain in the early 19th century viewed history as a set of heterogeneous and complex events, and statistical tables as giving unrelated facts. Both of these attitudes had to be broken down before time-series graphs could be brought into use for revealing regularities in economic events by the century's end.

Relevance: 100.00%

Abstract:

The increasing prevalence of obesity and its associated complications requires specialized care to improve outcomes and control health care costs. Obesity is associated with numerous serious and costly medical problems that require specialized management. The economic burden of obesity includes increased inpatient and outpatient medical expenditures as well as employer-related issues of absenteeism and associated costs. The objectives of this study are: (1) to describe the health consequences and the economic burden of obesity; (2) to review the existing treatments; and (3) to argue in favor of a specialized nutritional intervention that has been shown to improve health and reduce obesity-related health care costs, and accordingly to explore the possibility of introducing specialized nutrition in Switzerland and the feasibility of this project given the medical trends and reimbursement system in Switzerland. The benefits and outcomes for patients will be significant weight loss, which reduces the severity of and risk factors for complications, and improved health and quality of life. Weight loss will be achieved through a combination of diet, exercise and behavioral interventions, the basic recommendations for obesity treatment, in addition to specialized nutritional support. By nutritional support, we mean products intended to provide nutritional support in the dietary management of people with specific diseases and conditions when adequate intake of regular foods is compromised. These products are called foods for special medical purposes (FSMP). They are not intended to treat, cure, prevent, mitigate or have a direct impact on disease in a manner similar to drugs or other medical treatments, and they should be used under medical supervision. They also provide a low-cost alternative to surgery. From a health care system perspective, specialized nutrition derives its advantage from reducing the utilization of medical services for obesity-associated complications, such as medication, physician consultations and surgical interventions, leading to cost-effective care for hospitals, health care organizations and third-party payers, i.e. the health insurers. [Author, p. 4]

Relevance: 100.00%

Abstract:

Neural signal processing is a discipline within neuroengineering. This interdisciplinary approach combines principles from machine learning, signal processing theory, and computational neuroscience and applies them to problems in basic and clinical neuroscience. The ultimate goal of neuroengineering is a technological revolution in which machines interact in real time with the brain. Machines and brains could interface, enabling normal function in cases of injury or disease, brain monitoring, and/or medical rehabilitation of brain disorders. Much current research in neuroengineering focuses on understanding the coding and processing of information in the sensory and motor systems, quantifying how this processing is altered in the pathological state, and determining how it can be manipulated through interactions with artificial devices, including brain–computer interfaces and neuroprosthetics.
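As a rough illustration of the kind of low-level operation this field builds on, the sketch below (not taken from the abstract above) filters a simulated EEG-like signal and estimates its alpha-band power with SciPy. The sampling rate, band limits, and synthetic signal are all assumptions made for the example.

```python
# Minimal sketch of band-power feature extraction, a common first step before
# feeding neural signals to machine-learning decoders in brain-computer
# interfaces. All parameter values and the signal itself are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 250.0                          # assumed sampling rate in Hz
t = np.arange(0, 10, 1.0 / fs)      # 10 s of data
rng = np.random.default_rng(0)
# synthetic signal: a 10 Hz "alpha" oscillation buried in noise
x = np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)

def band_power(signal, fs, low, high):
    """Mean power spectral density of `signal` within [low, high] Hz (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= low) & (freqs <= high)
    return float(np.mean(psd[mask]))

# band-pass filter to isolate the alpha band before further processing
b, a = butter(4, [8 / (fs / 2), 13 / (fs / 2)], btype="band")
x_alpha = filtfilt(b, a, x)

print("alpha-band mean PSD:", band_power(x, fs, 8, 13))
print("filtered signal variance:", float(np.var(x_alpha)))
```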

Relevance: 100.00%

Abstract:

Recent years have produced great advances in instrumentation technology. The amount of available data has been increasing due to the simplicity, speed and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without a proper analysis. This has been one of the reasons for the ever-growing success of multivariate handling of such data. Industrial data is commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This places certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches in the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures or partial least squares), but there are also other methods that should be considered. The more advanced methods include multi-block modeling and nonlinear modeling. In this thesis it is shown that the results of data analysis vary according to the modeling approach used, thus making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should be different from the case where the purpose of modeling is mainly to obtain information about the variables and the process. For industrial applicability it is essential that the methods are robust and sufficiently simple to apply. In this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. In this thesis, differences between data analysis methods are compared using data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries. The results are compared to those from PLS and priority PLS. The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry. The response has a nonlinear relation to the descriptor matrix, and the results are compared between linear modeling, polynomial PLS and nonlinear modeling using nonlinear score vectors.
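A minimal sketch of the two baseline approaches named above, PCA for exploring the variable structure and PLS for prediction, using scikit-learn rather than the tools of the thesis; the data are random and purely illustrative.

```python
# Illustrative PCA and PLS on random "process" data; not the thesis setup.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 20))          # 100 samples, 20 spectral/process variables
y = X[:, :3] @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

Xs = StandardScaler().fit_transform(X)  # autoscaling is common for process data

# PCA: how much of the X-variance do the first few latent components explain?
pca = PCA(n_components=5).fit(Xs)
print("explained variance ratio:", pca.explained_variance_ratio_.round(3))

# PLS: latent components chosen to explain the response y, used for prediction
pls = PLSRegression(n_components=3).fit(Xs, y)
print("R^2 on training data:", round(pls.score(Xs, y), 3))
```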

Relevance: 100.00%

Abstract:

Radiostereometric analysis (RSA) is a highly accurate method for the measurement of in vivo micromotion of orthopaedic implants. Validation of the RSA method is a prerequisite for performing clinical RSA studies. Only a limited number of studies have utilised the RSA method in the evaluation of migration and inducible micromotion during fracture healing. Volar plate fixation of distal radial fractures has increased in popularity. There is still very little prospective randomised evidence supporting the use of these implants over other treatments. The aim of this study was to investigate the precision, accuracy, and feasibility of using RSA in the evaluation of healing in distal radius fractures treated with a volar fixed-angle plate. A physical phantom model was used to validate the RSA method for simple distal radius fractures. A computer simulation model was then used to validate the RSA method for more complex interfragmentary motion in intra-articular fractures. A separate pre-clinical investigation was performed in order to evaluate the possibility of using novel resorbable markers for RSA. Based on the validation studies, a prospective RSA cohort study of fifteen patients with plated AO type-C distal radius fractures with a 1-year follow-up was performed. RSA was shown to be highly accurate and precise in the measurement of fracture micromotion using both physical and computer simulated models of distal radius fractures. Resorbable RSA markers demonstrated potential for use in RSA. The RSA method was found to have a high clinical precision. The fractures underwent significant translational and rotational migration during the first two weeks after surgery, but not thereafter. Maximal grip caused significant translational and rotational interfragmentary micromotion. This inducible micromotion was detectable up to eighteen weeks, even after the achievement of radiographic union. The application of RSA in the measurement of fracture fragment migration and inducible interfragmentary micromotion in AO type-C distal radius fractures is feasible but technically demanding. RSA may be a unique tool in defining the progress of fracture union.
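RSA migration measures are ultimately derived from the rigid-body motion of marker clusters between stereoradiographic examinations. Purely as an illustration (not the RSA software used in the study), the sketch below estimates the rigid-body rotation and translation between two sets of corresponding marker coordinates with the SVD-based Kabsch method; the marker positions and the simulated migration are invented.

```python
# Illustrative sketch: recover the rotation R and translation t mapping one set
# of marker positions onto another (least-squares rigid transform via SVD).
import numpy as np

def rigid_transform(P, Q):
    """Rigid transform mapping points P (n x 3) onto Q (n x 3): Q ~ R @ P + t."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# four marker positions (mm) before and after a small simulated migration
P = np.array([[0.0, 0.0, 0.0],
              [10.0, 0.0, 0.0],
              [0.0, 10.0, 0.0],
              [0.0, 0.0, 10.0]])
theta = np.radians(2.0)                      # 2 degree rotation about z
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T + np.array([0.5, -0.3, 0.1])    # rotated and translated markers

R_est, t_est = rigid_transform(P, Q)
print("recovered translation (mm):", t_est.round(3))
print("recovered rotation (deg):", np.degrees(np.arctan2(R_est[1, 0], R_est[0, 0])).round(3))
```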

Relevance: 100.00%

Abstract:

This study aimed at identifying different conditions of coffee plants after the harvesting period, using data mining and spectral behavior profiles from the Hyperion/EO-1 sensor. The Hyperion image, with a spatial resolution of 30 m, was acquired on August 28, 2008, at the end of the coffee harvest season in the study area. For image pre-processing, atmospheric and signal/noise corrections were carried out using the FLAASH and MNF (Minimum Noise Fraction) transform algorithms, respectively. Thirty-eight spectral behavior profiles of different coffee varieties were generated from 150 Hyperion bands. The spectral behavior profiles were analyzed with the Expectation-Maximization (EM) algorithm considering 2, 3, 4 and 5 clusters. A t-test at the 5% significance level was used to verify the similarity among the cluster means per wavelength. The results demonstrated that it is possible to separate five different clusters, which corresponded to different coffee crop conditions, making it possible to improve future intervention actions.
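A minimal sketch of EM-based clustering of spectral profiles using a Gaussian mixture model (scikit-learn's EM implementation); the spectra here are randomly generated stand-ins, not the Hyperion data described above.

```python
# Illustrative EM clustering of synthetic "spectral profiles" for several
# candidate cluster counts, mirroring the 2-5 cluster comparison above.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n_bands = 150
# synthetic profiles: two latent crop conditions with different mean reflectance
profiles = np.vstack([
    rng.normal(loc=0.30, scale=0.02, size=(20, n_bands)),
    rng.normal(loc=0.45, scale=0.02, size=(18, n_bands)),
])

for k in (2, 3, 4, 5):                       # number of clusters to try
    gmm = GaussianMixture(n_components=k, covariance_type="diag", random_state=0)
    labels = gmm.fit_predict(profiles)
    print(f"k={k}: BIC={gmm.bic(profiles):.1f}, cluster sizes={np.bincount(labels)}")
```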

Relevance: 100.00%

Abstract:

View angle and directional effects significantly affect reflectance and vegetation indices, especially when daily images collected by large field-of-view (FOV) sensors like the Moderate Resolution Imaging Spectroradiometer (MODIS) are used. In this study, the PROSAIL radiative transfer model was chosen to evaluate the impact of the geometry of data acquisition on soybean reflectance and two vegetation indices (Normalized Difference Vegetation Index - NDVI and Enhanced Vegetation Index - EVI) by varying biochemical and biophysical parameters of the crop. Input values for the PROSAIL simulation were based on the literature and were adjusted by comparing simulated and real satellite soybean spectra acquired by MODIS/Terra and the hyperspectral Hyperion/Earth Observing-One (EO-1). Results showed that the influence of the view angle and view direction on reflectance was stronger with decreasing leaf area index (LAI) and chlorophyll concentration. Because of its greater dependence on near-infrared reflectance, the EVI was much more sensitive to viewing geometry than the NDVI, presenting larger values in the backscattering direction; the contrary, larger values in the forward-scattering direction, was observed for the NDVI. In relation to the LAI, the NDVI was much more isotropic for closed soybean canopies than for incomplete canopies, and the contrary behavior was verified for the EVI.
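For reference, a minimal sketch of how the two indices compared above are computed from surface reflectance; the reflectance values are invented, and the EVI coefficients are the standard MODIS ones (G = 2.5, C1 = 6, C2 = 7.5, L = 1).

```python
# NDVI and EVI from red, near-infrared, and blue reflectance (illustrative data).
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced Vegetation Index with the standard MODIS coefficients."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# illustrative soybean-like reflectances: dense canopy vs. sparse canopy
nir  = np.array([0.45, 0.30])
red  = np.array([0.04, 0.10])
blue = np.array([0.03, 0.06])

print("NDVI:", ndvi(nir, red).round(3))   # tends to saturate over dense canopies
print("EVI: ", evi(nir, red, blue).round(3))
```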

Relevance: 100.00%

Abstract:

The present study aims to present the main concepts of sugarcane straw-to-energy planning. Throughout the study, the subject is contextualized by highlighting broader aspects of sustainability, which is considered the main driver of agro-energy modernization. Concerning sugarcane straw, we first evaluated its availability with regard to technical and economic aspects, and then summarized the straw production chain for energy supply purposes. As a proposal to support agro-energy planning, some spatial tools are presented that have so far been barely used in the Brazilian energy planning context. Working on straw-to-electricity on a supply-chain basis, we developed a conceptual model to spatially assess this bioenergy system. Using the proposed model, the whole supply chain is described at the state level, accounting for the potential of a single mill to exploit straw as well as the main costs associated with straw acquisition, investments in the straw recovery routes, and electricity transmission. Bearing these concepts in mind, we believe that spatial analysis can bring important information to agro-energy action plans.

Relevance: 100.00%

Abstract:

One of the most crucial tasks for a company offering a software product is to decide what new features should be implemented in the product's forthcoming versions. Yet existing studies show that this is also a task with which many companies are struggling. This problem has been claimed to be ambiguous and changing. There are better or worse solutions to the problem, but no optimal one. Furthermore, the criteria determining the success of a solution keep changing due to continuously changing competition, technologies and market needs. This thesis seeks to gain a deeper understanding of the challenges that companies have reportedly faced in determining the requirements for their forthcoming product versions. To this end, product management related activities are explored in seven companies. Following a grounded theory approach, the thesis conducts four iterations of data analysis, where each iteration goes beyond the previous one. The thesis results in a theory proposal intended to 1) describe the essential characteristics of organizations' product management challenges, 2) explain the origins of the perceived challenges and 3) suggest strategies to alleviate the perceived challenges. The thesis concludes that current product management approaches are becoming inadequate to deal with challenges that have multiple and conflicting interpretations, different value orientations, unclear goals, contradictions and paradoxes. This inadequacy will continue to increase until current beliefs and assumptions about the product management challenges are questioned and a new paradigm for dealing with the challenges is adopted.

Relevance: 100.00%

Abstract:

Markku Laitinen's keynote presentation at the QQML conference in Limerick, Ireland, on 23 April 2012.

Relevance: 100.00%

Abstract:

Outlier detection is an important form of data analysis because, in many cases, outliers contain the interesting and important pieces of information. In recent years, many different outlier detection algorithms have been devised for finding different kinds of outliers in varying contexts and environments. Some effort has been put into studying how to effectively combine different outlier detection methods. The combination of outlier detection algorithms as an ensemble was studied in this thesis by designing a modular framework for outlier detection that combines arbitrary outlier detection techniques. This work resulted in an example implementation of the framework. The outlier detection capability of the ensemble method was validated using datasets and methods found in outlier detection research. The framework achieved better results than the individual outlier detection algorithms. Future research includes how to handle large datasets effectively and the possibilities for real-time outlier monitoring.
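A minimal sketch in the spirit of the ensemble idea described above (not the thesis framework itself): two off-the-shelf detectors score the same data, the scores are rank-normalized so they become comparable, and the normalized scores are averaged. The data and detector choices are illustrative assumptions.

```python
# Illustrative outlier-detection ensemble: combine IsolationForest and LOF
# scores after rank normalization, then report the most outlying points.
import numpy as np
from scipy.stats import rankdata
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(200, 2)),            # inliers
               rng.uniform(-6, 6, size=(5, 2))])     # a few planted outliers

# each detector is negated so that larger values mean "more outlying"
iso_scores = -IsolationForest(random_state=0).fit(X).score_samples(X)
lof_scores = -LocalOutlierFactor(n_neighbors=20).fit(X).negative_outlier_factor_

def normalize(scores):
    """Map scores to [0, 1] by rank so detectors on different scales can be combined."""
    r = rankdata(scores)
    return (r - 1) / (len(r) - 1)

ensemble = (normalize(iso_scores) + normalize(lof_scores)) / 2
top = np.argsort(ensemble)[-5:]            # indices of the 5 most outlying points
print("most outlying indices:", sorted(top.tolist()))
```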

Relevance: 100.00%

Abstract:

The aim of this study was to identify and map the weed population in a no-tillage area. Geostatistical techniques were used in the mapping in order to assess this information as a tool for the localized application of herbicides. The study area covers 58.08 hectares and was sampled on a fixed square grid (points spaced 50 m apart, 232 points in total) using a GPS receiver. At each point, the weed species and population were analyzed within a fixed 0.25 m² square. The species Ipomoea grandifolia, Gnaphalium spicatum, Richardia spp. and Emilia sonchifolia showed no spatial dependence. However, the species Conyza spp., Cenchrus echinatus and Eleusine indica showed spatial correlation. Among the models tested, the spherical model provided a better fit for Conyza spp. and Eleusine indica, and the Gaussian model for Cenchrus echinatus. The three species have a clumped spatial distribution. The mapping of weeds can be a tool for localized control, making herbicide use more rational, effective and economical.
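For reference, a minimal sketch of the two semivariogram models named above (spherical and Gaussian) in their common textbook form; the nugget, sill, and range values are invented for illustration and are not the parameters fitted in the study.

```python
# Spherical and Gaussian semivariogram models evaluated at a few lag distances.
import numpy as np

def spherical(h, nugget, sill, range_):
    """Spherical model for h > 0: rises as 1.5(h/a) - 0.5(h/a)^3, flat at the sill beyond the range."""
    h = np.asarray(h, dtype=float)
    inside = nugget + (sill - nugget) * (1.5 * h / range_ - 0.5 * (h / range_) ** 3)
    return np.where(h < range_, inside, sill)

def gaussian(h, nugget, sill, range_):
    """Gaussian model for h > 0: approaches the sill asymptotically (practical-range form)."""
    h = np.asarray(h, dtype=float)
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (h / range_) ** 2))

lags = np.array([10.0, 50.0, 100.0, 150.0, 200.0])   # lag distances in metres
print("spherical:", spherical(lags, nugget=0.1, sill=1.0, range_=150.0).round(3))
print("gaussian :", gaussian(lags, nugget=0.1, sill=1.0, range_=150.0).round(3))
```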

Relevance: 100.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014.