975 results for Global Processing Speed
Abstract:
When dealing with nonlinear blind processing algorithms (deconvolution or post-nonlinear source separation), complex mathematical estimations must be performed, resulting in very slow algorithms. This is the case, for example, in speech processing, spike signal deconvolution or microarray data analysis. In this paper, we propose a simple method to reduce the computational time of inverting Wiener systems or separating post-nonlinear mixtures by using a linear approximation in a minimum mutual information algorithm. Simulation results demonstrate that linear spline interpolation is fast and accurate, obtaining very good results (similar to those obtained without approximation) while computational time is dramatically decreased. Cubic spline interpolation also obtains similarly good results, but because of its intrinsic complexity the overall algorithm becomes much slower and is therefore not useful for our purpose.
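The speed-up hinges on evaluating the estimated nonlinearity through a first-order (linear) spline instead of recomputing it exactly at every sample. The following is a minimal sketch of that idea only, not the paper's algorithm; the nonlinearity g_exact and the knot grid are invented for illustration.

```python
import numpy as np

# Toy illustration (not the paper's exact algorithm): replace repeated
# evaluations of an estimated nonlinearity g(x) with a linear spline
# built on a coarse grid of knots.

def g_exact(x):
    # Hypothetical compensating nonlinearity; in practice this would be
    # estimated inside the minimum-mutual-information iteration.
    return np.tanh(x) + 0.1 * x**3

# Build the linear spline approximation once on a small set of knots.
knots = np.linspace(-4.0, 4.0, 51)          # 51 knots cover the signal range
g_at_knots = g_exact(knots)

def g_spline(x):
    # np.interp performs piecewise-linear (first-order spline) interpolation.
    return np.interp(x, knots, g_at_knots)

# Compare on a long signal, as would occur at every iteration.
rng = np.random.default_rng(0)
s = rng.standard_normal(100_000)
approx_error = np.max(np.abs(g_exact(s) - g_spline(s)))
print(f"max absolute interpolation error: {approx_error:.4f}")
```

With a modest number of knots, every subsequent evaluation reduces to a table lookup plus one linear blend per sample, which is where the reported reduction in computational time comes from.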
Abstract:
Working memory, commonly defined as the ability to hold mental representations online transiently and to manipulate these representations, is known to be a core deficit in schizophrenia. The aim of the present study was to investigate the visuo-spatial component of working memory in schizophrenia, and more precisely to what extent dynamic visuo-spatial information processing is impaired in schizophrenia patients. For this purpose we used a computerized paradigm in which 29 patients with schizophrenia (DSM-IV, Diagnostic Interview for Genetic Studies) and 29 age- and sex-matched control subjects (DIGS) had to memorize the trajectory of a plane moving across the computer screen and to identify the observed trajectory among 9 plots presented together. Each trajectory could be viewed up to 3 times if needed. The results showed no difference between schizophrenia patients and controls regarding the number of correct trajectories identified after the first presentation. However, when we determined the mean number of correct trajectories on the basis of 3 trials, we observed that schizophrenia patients performed significantly worse than controls (Mann-Whitney, p = 0.002). These findings suggest that, although schizophrenia patients are able to memorize some dynamic trajectories as well as controls, they do not profit from the repetition of the trajectory presentation. These findings are congruent with the hypothesis that schizophrenia could induce an imbalance between local and global information processing: the patients may be able to focus on details of the trajectory, which could allow them to find the right target (bottom-up processes), but may have difficulty referring to previous experience in order to filter incoming information (top-down processes) and enhancing their visuo-spatial working memory abilities.
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of the consequences of inappropriate actions and decisions and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and do not give information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 TB, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-hazard risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that the vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and the identification of the role of ecosystems in disaster risk are rapidly developing fields. This research has revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
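As a rough, hedged illustration of the multiplicative structure mentioned above (risk as the intersection of hazard, exposure and vulnerability), the sketch below combines the three components into a simple comparative index; the figures, weights and normalisation are invented for the example and are not the thesis' model.

```python
import numpy as np

# Toy illustration only: three fictitious countries A, B, C.
hazard_freq   = np.array([0.8, 0.3, 0.5])        # hazard events per year
exposure      = np.array([2.0e6, 5.0e5, 1.2e6])  # people in hazard-prone areas
vulnerability = np.array([0.04, 0.10, 0.02])     # fraction of exposed people affected

# Expected annual impact, then a 0-1 index for cross-country comparison.
expected_impact = hazard_freq * exposure * vulnerability
risk_index = expected_impact / expected_impact.max()
for country, idx in zip(["A", "B", "C"], risk_index):
    print(f"country {country}: relative risk index = {idx:.2f}")
```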
Abstract:
VALOSADE (Value Added Logistics in Supply and Demand Chains) is a research project of Anita Lukka's VALORE (Value Added Logistics Research) research team at Lappeenranta University of Technology. VALOSADE is included in the ELO (Ebusiness logistics) technology programme of Tekes (Finnish Technology Agency). SMILE (SME-sector, Internet applications and Logistical Efficiency) is one of the four subprojects of VALOSADE. SMILE research focuses on a case network composed of small and medium-sized mechanical maintenance service providers and global wood processing customers. The basic principle of the SMILE study is communication and ebusiness in the supply and demand network. This first phase of the research concentrates on creating the background for the SMILE study and for the ebusiness solutions of the maintenance case network. The focus is on general trends of ebusiness in supply chains and networks of different industries; the total ebusiness system architecture of company networks; the ebusiness strategy of a company network; the information value chain; the different factors that influence the ebusiness solution of a company network; and the correlation between ebusiness and competitive advantage. Literature, interviews and benchmarking were used as research methods in this qualitative case study. Networks and end-to-end supply chains are organizational structures that can add value for the end customer. Information is one of the key factors in these decentralized structures. Because of the decentralization of business, information is produced and used in different companies and in different information systems. Information refinement services are needed to manage information flows in company networks between different systems. Furthermore, some new solutions, such as network information systems, are utilised in optimising network performance and in standardizing common network processes. Some cases have, however, indicated that the utilization of ebusiness in a decentralized business model is not always a necessity; the value added by ICT must be defined case-specifically. In the theory part of the report, different ebusiness and architecture models are introduced. These models are compared with the empirical case data in the research results. The biggest difference between theory and empirical data is that the models are mainly developed for large-scale companies, not for SMEs, because implemented network ebusiness solutions have mainly been centred on large companies. Genuine SME-network-centred ebusiness models are quite rare, and studies in that area have been few in number. Business relationships between customers and their SME suppliers nowadays concentrate more on collaborative tactical and strategic initiatives, in addition to transaction-based operational initiatives. However, ebusiness systems are still mainly based on the exchange of operational transactional data. Collaborative ebusiness solutions are in the planning or pilot phase in most case companies. Furthermore, many ebusiness solutions today involve only two participants, whereas network and end-to-end supply chain transparency and information systems are quite rare. Transaction volumes, data formats, the types of exchanged information, information criticality, the type and duration of the business relationship, the partners' internal information systems, and processes and operation models (e.g. different ordering models) differ among the network companies, and furthermore the companies are at different stages of networking and ebusiness readiness. Because of these factors, different customer-supplier combinations in the network must utilise quite different ebusiness architectures, technologies, systems and standards.
Abstract:
Perceiving the world visually is a basic act for humans, but for computers it is still an unsolved problem. The variability present in natural environments is an obstacle for effective computer vision. The goal of invariant object recognition is to recognise objects in a digital image despite variations in, for example, pose, lighting or occlusion. In this study, invariant object recognition is considered from the viewpoint of feature extraction. The differences between local and global features are studied, with emphasis on Hough transform and Gabor filtering based feature extraction. The methods are examined with respect to four capabilities: generality, invariance, stability, and efficiency. Invariant features are presented using both the Hough transform and Gabor filtering. A modified Hough transform technique is also presented, in which distortion tolerance is increased by incorporating local information. In addition, methods for decreasing the computational cost of the Hough transform employing parallel processing and local information are introduced.
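For context, the sketch below shows the standard global straight-line Hough transform that such modified techniques build on; it assumes a binary edge image and the usual (rho, theta) parameterisation, and it does not include the local-information or parallel-processing extensions discussed in the thesis.

```python
import numpy as np

# Minimal standard Hough transform for straight lines (sketch only).
def hough_lines(edge_img, n_theta=180):
    h, w = edge_img.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    accumulator = np.zeros((2 * diag, n_theta), dtype=np.int32)

    ys, xs = np.nonzero(edge_img)            # edge pixel coordinates
    for theta_idx, theta in enumerate(thetas):
        # rho = x*cos(theta) + y*sin(theta), shifted so indices are >= 0
        rhos = np.round(xs * np.cos(theta) + ys * np.sin(theta)).astype(int) + diag
        np.add.at(accumulator, (rhos, theta_idx), 1)
    return accumulator, thetas

# Usage: the peak of the accumulator gives the dominant line's (rho, theta).
edges = np.zeros((64, 64), dtype=np.uint8)
edges[np.arange(64), np.arange(64)] = 1       # a synthetic diagonal line
acc, thetas = hough_lines(edges)
rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
print(f"dominant line: theta = {np.degrees(thetas[theta_idx]):.1f} degrees")
```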
Abstract:
The objective of this Master's thesis was to select the most suitable acquisition target from among several competitors for a supplier of wood handling machines. First, the Finnish forest industry and the forest cluster that has grown out of its competence needs were introduced, mainly from the perspective of the case company. Next, an overview of the company's products, competitors and customers was given. The acquisition process was described, and the general motives and critical success factors were presented. In addition, the analysis of competitors and the business environment was described as a prerequisite for the company's success. The woodworking machinery market was segmented and analysed from 1990 to the present day in order to identify segments with development potential, i.e. areas where the company's market share could be increased. The characteristics of the candidates were compared with the motives for the acquisition. The companies' products and geographical locations were scored so that the most suitable companies would stand out. Three companies were selected for closer examination. The companies' products, financial positions and global networks were compared with each other alongside other factors, such as the state of the world economy. A financially stable and technically versatile company matched the acquisition motives best. The target's positive aspects were its location, products and services. In addition, the company fits the buyer's strategy and helps to meet customers' current and future needs.
Abstract:
Each day, Earth's finite resources are being depleted for energy, for material goods, for transportation, for housing, and for drugs. As we evolve scientifically and technologically, and as the population of the world rapidly approaches 7 billion and beyond, one of the many issues we face is the continued availability of drugs for future global health care. Medicinal agents are primarily derived from two sources, synthetic and natural, or in some cases, as semi-synthetic compounds, a mixture of the two. For the developed world, efforts have been initiated to make drug production "greener", with milder reagents, shorter reaction times, more efficient processing that uses less energy, and reactions that are more atom-efficient and generate fewer by-products. However, most of the world's population uses plants, in either crude or extract form, for their primary health care. There is as yet relatively little discussion about the long-term effects of the current, non-sustainable harvesting of medicinal plants from the wild, which is depleting these critical resources without concurrent initiatives to commercialize their cultivation. To meet future public health care needs, a paradigm shift is required towards new approaches, using contemporary technology, that will result in drugs being regarded as a sustainable commodity, irrespective of their source. In this presentation, several approaches to enhancing and sustaining the availability of drugs, both synthetic and natural, will be discussed, including the use of vegetables as chemical reagents and the deployment of integrated strategies involving information systems, biotechnology, nanotechnology, and detection techniques for the development of medicinal plants with enhanced levels of bioactive agents.
Abstract:
Fan systems are responsible for approximately 10% of the electricity consumption in the industrial and municipal sectors, and it has been found that there is energy-saving potential in these systems. To this end, variable speed drives (VSDs) are used to enhance the efficiency of fan systems. Usually, fan system operation is optimized based on measurements of the system, but suitable meters are seldom readily installed. Thus, sensorless methods are needed for the optimization of fan system operation. In this thesis, methods for estimating the fan operating point with a variable speed drive are studied and discussed. These methods can be used for the energy-efficient control of the fan system without additional measurements. The operation of these methods is validated by laboratory measurements and by data from an industrial fan system. In addition to energy consumption, condition monitoring of fan systems is a key issue, as fans are an integral part of various production processes. Fan system condition monitoring is usually carried out with vibration measurements, which again increase system complexity. However, variable speed drives can already be used for pumping system condition monitoring. Therefore, it would add to the usability of a variable-speed-driven fan system if the variable speed drive could be used as a condition monitoring device. In this thesis, sensorless detection methods for three lifetime-reducing phenomena are suggested: detection of fan contamination build-up, of the correct rotational direction, and of fan surge. The methods use the monitoring and control options of the variable speed drive together with simple signal processing methods, such as power spectral density estimates. The methods have been validated by laboratory measurements. The key finding of this doctoral thesis is that a variable speed drive can be used on its own as a monitoring and control device for fan system energy efficiency, and that it can also be used to detect certain lifetime-reducing phenomena.
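As a hedged illustration of the kind of drive-based detection described above, the sketch below computes a Welch power spectral density estimate from a simulated drive-internal power signal and flags unusually strong low-frequency content; the signal, the frequency band and the threshold are invented for the example and are not the thesis' validated method.

```python
import numpy as np
from scipy import signal

fs = 1000.0                      # sampling frequency of the drive signal [Hz]
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
power_signal = 1.0 + 0.05 * rng.standard_normal(t.size)
power_signal += 0.2 * np.sin(2 * np.pi * 3.0 * t)   # simulated 3 Hz surge oscillation

# Welch estimate of the power spectral density.
f, psd = signal.welch(power_signal, fs=fs, nperseg=2048)

low_band = (f > 1.0) & (f < 10.0)                    # assumed surge band
surge_indicator = psd[low_band].max() / np.median(psd)
print(f"surge indicator: {surge_indicator:.1f}")
print("possible surge signature" if surge_indicator > 100 else "no surge signature")
```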
Abstract:
In this thesis, the suitability of different trackers for finger tracking in high-speed videos was studied. Tracked finger trajectories from the videos were post-processed and analysed using various filtering and smoothing methods. Position derivatives of the trajectories, speed and acceleration, were extracted for the purposes of hand motion analysis. Overall, two methods, Kernelized Correlation Filters and Spatio-Temporal Context Learning tracking, performed better than the others in the tests. Both achieved high accuracy on the selected high-speed videos and also allowed real-time processing, being able to process over 500 frames per second. In addition, the results showed that different filtering methods can be applied to produce more appropriate velocity and acceleration curves from the tracking data. Local Regression filtering and the Unscented Kalman Smoother gave the best results in the tests. Furthermore, the results show that the tracking and filtering methods are suitable for high-speed hand tracking and trajectory-data post-processing.
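The post-processing step can be pictured as follows: a sketch of differentiating a tracked fingertip position into speed and acceleration, using a Savitzky-Golay (local polynomial regression) filter as a stand-in for the Local Regression filtering mentioned above; the frame rate and the synthetic trajectory are invented for illustration.

```python
import numpy as np
from scipy.signal import savgol_filter

fps = 500.0                                   # assumed high-speed camera frame rate
t = np.arange(0, 1, 1 / fps)
rng = np.random.default_rng(2)
x = 100 * np.sin(2 * np.pi * 1.5 * t) + rng.normal(0, 0.5, t.size)  # px
y = 50 * t**2 + rng.normal(0, 0.5, t.size)                          # px

# Smooth the positions, then differentiate with respect to time.
x_s = savgol_filter(x, window_length=31, polyorder=3)
y_s = savgol_filter(y, window_length=31, polyorder=3)
vx, vy = np.gradient(x_s, 1 / fps), np.gradient(y_s, 1 / fps)
ax, ay = np.gradient(vx, 1 / fps), np.gradient(vy, 1 / fps)

speed = np.hypot(vx, vy)                      # px/s
accel = np.hypot(ax, ay)                      # px/s^2
print(f"peak speed: {speed.max():.0f} px/s, peak acceleration: {accel.max():.0f} px/s^2")
```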
Abstract:
Nowadays, global business trends force the adoption of innovative ICTs into supply chain management (SCM). In particular, RFID technology is in high demand among SCM professionals due to its business advantages, such as improving the accuracy and speed of SCM processes, which leads to lower operational costs. Nevertheless, the question of the RFID technology's impact on the efficiency of warehouse processes in SCM remains open. The goal of the present study is to test the possibility of improving order-picking speed in the warehouse of a large logistics company with the use of RFID technology. In order to achieve this goal, the following objectives were set: 1) defining the scope of RFID technology applications in SCM; 2) justifying the impact of RFID technology on SCM processes; 3) defining the place of the warehouse order-picking process in SCM; 4) identifying and systematizing existing methods of improving order-picking speed; 5) choosing the study object and gathering empirical data on the number of orders and the number of hours spent per order line daily during 5 months; 6) processing and analysing the empirical data; 7) drawing conclusions about the impact of RFID technology on the speed of the order-picking process. As a result of the research, it was found that the speed of the order-picking process did not change over time after the RFID adoption. It was concluded that, in order to achieve a positive effect on the speed of the order-picking process with the use of RFID technology, it is necessary to simultaneously implement changes in logistics and organizational management in 3PL logistics companies. Practical recommendations have been forwarded to the management of the company for further investigation and action.
Abstract:
Feature extraction is the part of pattern recognition where the sensor data is transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the next stages of the system and to preserve the information essential for discriminating the data into different classes. For instance, in image analysis the actual image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means of detecting features that are invariant to certain types of illumination changes. Finally, classification tries to make decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features. Low-level Local Binary Pattern (LBP) based features play a main role in the analysis. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely determined by the decisions made during the implementation phase. The implementation alternatives of LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated into this framework by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented. Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which LBPs are seen as combinations of n-tuples is also presented.
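As background for the methods above, the sketch below implements the basic 8-neighbour Local Binary Pattern operator; it is the textbook formulation only (no circular interpolation, rotation invariance or focal-plane parallelism), and the test patch is synthetic.

```python
import numpy as np

# Basic 3x3 LBP: each pixel is replaced by an 8-bit code describing which
# neighbours are at least as bright as the centre.
def lbp_3x3(img):
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]                                   # centre pixels
    # Neighbour offsets in clockwise order, each weighted by a power of two.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes += (neighbour >= c).astype(np.int32) << bit
    return codes

# Typical use: a 256-bin histogram of the codes serves as the texture feature.
rng = np.random.default_rng(3)
patch = rng.integers(0, 256, size=(32, 32))
hist, _ = np.histogram(lbp_3x3(patch), bins=256, range=(0, 256))
print("feature vector length:", hist.size)
```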
Abstract:
This study examines a waste power plant's optimal processing chain, and it is important to consider from several points of view why one option is better than another, to ensure that the right decision is made. Incineration of waste has developed into a decent option for waste disposal. There are several legislative matters and technical options to consider when starting up a waste power plant. Of the techniques, pretreatment, the burner and flue gas cleaning are the biggest ones to consider. The treatment of incineration residues is important, since they can be very harmful to the environment. Energy production from waste is not highly efficient, and several harmful compounds are emitted. Recycling of waste before incineration is not very typical, and there are not many recycling options for materials that cannot easily be recycled into the same product. Life cycle assessment is a good option for studying the environmental effects of the system; it has four phases that form part of an iterative study process. In this study, the case environment is a waste power plant. The modelling of the plant is done with the GaBi 6 software, and the scope is from gate to grave. There are three different scenarios, of which the first and second are compared with each other to reach conclusions; the zero scenario is included to demonstrate the situation without the power plant. In scenario one, the power plant recycles some materials, and in scenario two it recycles even more materials and utilizes the bottom ash in more ways than one. The model includes substitutive processes for the materials when they are not recycled in the plant. The global warming potential results show that scenario one is the best option, and the variable costs that were considered point to the same conclusion. The conclusion is that the waste power plant should not recycle more or utilize bottom ash in a number of ways; the area is not ready for that kind of utilization and production from recycled materials.
Abstract:
Brazil is one of the three largest producers of fruits in the world, and among its fruit trees the cashew tree stands out due to the high nutritional and commercial value of its products. During fruit processing, some compounds are lost, and few studies address this issue. Over the last decade, the conventional system of food production has increasingly been replaced by the organic cultivation system, which is a promising alternative source of income given the global demand for healthy food. Therefore, this research aimed to characterize and quantify the prevalent fatty acids found in cashew nuts obtained from conventional and organic cultivation during various stages of processing. The prevalent fatty acids found were palmitic, linoleic, oleic, and stearic acid. The average contents of these fatty acids were 6.93 ± 0.55, 16.99 ± 0.61, 67.62 ± 1.00 and 8.42 ± 0.55 g/100 g, respectively. There was no reduction in the palmitic, oleic and stearic fatty acid contents during processing. Very little difference was observed between the nuts obtained from conventional and organic cultivation, indicating that the cultivation method has little or no influence on the fatty acid content of cashew nuts.
Abstract:
The opportunity to supplement common cassava biscuits with a product of higher nutritional value meets consumer expectations. In this work, the effects of process parameters and flaxseed addition on the physical properties of expanded snacks were studied. The extrusion process was carried out using a single-screw extruder in a factorial central composite rotatable design with four factors: flaxseed flour percentage (0-20%), moisture (12-20%), extrusion temperature (90-130 °C) and screw speed (190-270 rpm). The effect of the extrusion variables was investigated in terms of expansion index, specific volume, water absorption index, water solubility index, color parameters (L*, a*, b*) and hardness. The data analysis showed that the extrusion process variables and the flaxseed flour content affected the physical properties of the puffed snacks. Among the experimental conditions used in the present study, expanded snack products with good physical properties can be obtained under the conditions of 10% flaxseed flour, 230 rpm screw speed, a temperature of 90 °C and 12% moisture.
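As a hedged illustration of the experimental design named above, the sketch below generates the coded runs of a four-factor central composite rotatable design; the number of centre points and the mapping of the temperature factor to physical units are assumptions made for the example.

```python
import numpy as np
from itertools import product

k = 4                                   # flaxseed %, moisture, temperature, screw speed
alpha = (2 ** k) ** 0.25                # = 2.0, the rotatability condition

factorial = np.array(list(product([-1, 1], repeat=k)))       # 16 corner runs
axial = np.vstack([alpha * np.eye(k), -alpha * np.eye(k)])   # 8 axial runs
center = np.zeros((4, k))                                     # assumed 4 centre runs

design = np.vstack([factorial, axial, center])                # coded factor levels
print(f"total runs: {design.shape[0]}")                       # 28 runs

# Decode one factor as an example: if temperature coded -1..+1 maps to 100-120 C,
# the axial points fall at the stated extremes of 90 and 130 C.
temp = 110 + 10 * design[:, 2]
print("temperature levels used:", sorted(set(temp.round(1).tolist())))
```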