978 results for computer forensics tools
Abstract:
The current research focuses on various questions raised and deliberated upon by different entrepreneurs. It provides a valuable contribution to understanding the importance of social media and ICT applications, and it demonstrates how to support management consulting and business coaching start-ups with the help of social media and ICT tools. The thesis presents a literature review drawing on information systems science, SME and e-business journals, web articles, and survey analysis reports on social media applications. The methodology was qualitative: social anthropological approaches were used to oversee the case study activities and collect data, and a collaborative social research approach framed the action research method. The research found that new business start-ups and small businesses do not use social media and ICT tools the way most large corporations do, even though current open-source ICT technologies and social media applications are just as available to new and small businesses as they are to larger companies. Successful implementation of social media and ICT applications can enhance start-up performance and help overcome business obstacles. The thesis sheds some light on effective and innovative implementation of social media and ICT applications for new entrepreneurs and small business owners.
Abstract:
The last decade has shown that the global paper industry needs new processes and products in order to reassert its position. As the paper markets in Western Europe and North America have stabilized, competition has tightened. Along with the development of more cost-effective processes and products, new process design methods are also required to break the old molds and create new ideas. This thesis discusses the development of a process design methodology based on simulation and optimization methods. A bi-level optimization problem and a solution procedure for it are formulated and illustrated. Computational models and simulation are used to illustrate the phenomena inside a real process, and mathematical optimization is exploited to find the best process structures and control principles for the process. Dynamic process models are used inside the bi-level optimization problem, which is assumed to be dynamic and multiobjective due to the nature of papermaking processes. The numerical experiments show that the bi-level optimization approach is useful for different kinds of problems related to process design and optimization. Here, the design methodology is applied to a constrained process area of a papermaking line. However, the same methodology is applicable to all types of industrial processes, e.g., the design of biorefineries, because the methodology is fully generalized and can be easily modified.
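To make the structure of such a design problem concrete, a generic bi-level, multiobjective formulation can be sketched as follows. The notation (design variables d, controls u, state x, objectives F and f) is assumed for illustration and is not taken from the thesis itself:

```latex
% Generic bi-level design problem (illustrative notation, not from the thesis):
% the upper level selects the process structure/design d, while the lower
% level computes the optimal control u for that design via the dynamic model.
\begin{align*}
\min_{d \in D}\; & F\bigl(d, u^{*}(d)\bigr)
  && \text{(upper level: process design objectives)}\\
\text{s.t.}\quad & u^{*}(d) \in \arg\min_{u \in U} f(d, u)
  && \text{(lower level: optimal control of the process)}\\
& \dot{x}(t) = g\bigl(x(t), u(t), d\bigr), \quad x(0) = x_{0}
  && \text{(dynamic process model)}
\end{align*}
```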
Abstract:
Technological developments in microprocessors and the ICT landscape have shifted computing to a new era in which computing power is embedded in numerous small distributed objects and devices in our everyday lives. These small computing devices are fine-tuned to perform a particular task and are increasingly reaching our society at every level. For example, home appliances such as programmable washing machines and microwave ovens employ several sensors to improve performance and convenience. Similarly, cars have on-board computers that use information from many different sensors to control things such as fuel injectors and spark plugs in order to perform their tasks efficiently. These individual devices make life easy by helping with decisions and removing burdens from their users. All these objects and devices obtain some piece of information about the physical environment, yet each of them is an island with no proper connectivity or information sharing with the others. Sharing information between these heterogeneous devices could enable a whole new universe of innovative and intelligent applications; however, it is a difficult task due to the heterogeneity and limited interoperability of the devices. The Smart Space vision is to overcome these issues of heterogeneity and interoperability so that devices can understand each other and utilize each other's services through information sharing. This enables innovative local mashup applications based on data shared between heterogeneous devices. Smart homes are one example of Smart Spaces: through intelligent interconnection of resources and their collective behavior, they make it possible to bring the health care system to the patient, as opposed to bringing the patient into the health system. In addition, the use of mobile handheld devices has risen at a tremendous rate during the last few years, and they have become an essential part of everyday life. Mobile phones offer a wide range of services to their users, including text and multimedia messages, Internet, audio, video, email applications and, most recently, TV services. Interactive TV provides a variety of applications for viewers. The combination of interactive TV and Smart Spaces could yield innovative applications that are personalized, context-aware, ubiquitous and intelligent by enabling heterogeneous systems to collaborate with each other by sharing information. There are many challenges in designing the frameworks and application development tools for rapid and easy development of such applications. The research work presented in this thesis addresses these issues. The original publications presented in the second part of this thesis propose architectures and methodologies for interactive and context-aware applications, and tools for the development of these applications. We demonstrate the suitability of our ontology-driven application development tools and rule-based approach for the development of dynamic, context-aware, ubiquitous iTV applications.
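To illustrate the core idea of information sharing in a Smart Space, the following is a minimal sketch of a shared store to which heterogeneous devices publish facts as triples and from which applications receive notifications by pattern. All class and device names are hypothetical; real Smart Space implementations build on an RDF triple store and an agreed ontology:

```python
# Minimal sketch of Smart Space-style information sharing: heterogeneous
# devices publish (subject, predicate, object) triples to a shared store,
# and applications subscribe to patterns. Names are hypothetical.

class SharedTripleStore:
    def __init__(self):
        self.triples = set()
        self.subscribers = []  # (pattern, callback) pairs

    def publish(self, subject, predicate, obj):
        triple = (subject, predicate, obj)
        self.triples.add(triple)
        for pattern, callback in self.subscribers:
            if self._matches(pattern, triple):
                callback(triple)

    def subscribe(self, pattern, callback):
        # Pattern items may be None, meaning "match anything".
        self.subscribers.append((pattern, callback))

    @staticmethod
    def _matches(pattern, triple):
        return all(p is None or p == t for p, t in zip(pattern, triple))

store = SharedTripleStore()
# An iTV application reacts to any device reporting a heart-rate reading.
store.subscribe(
    (None, "hasHeartRate", None),
    lambda t: print(f"iTV overlay: {t[0]} reports heart rate {t[2]}"),
)
# A wearable sensor, unaware of the TV, publishes its measurement.
store.publish("patient-monitor-1", "hasHeartRate", "72")
```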
Abstract:
The question of the trainability of executive functions and the impact of such training on related cognitive skills has stirred considerable research interest. Despite a number of studies investigating this, the question has not yet been resolved. The general aim of this thesis was to investigate two very different types of training of executive functions: laboratory-based computerized training (Studies I-III) and real-world training through bilingualism (Studies IV-V). Bilingualism as a kind of training of executive functions is based on the idea that managing two languages requires executive resources, and previous studies have suggested a bilingual advantage in executive functions. Three executive functions were studied in the present thesis: updating of working memory (WM) contents, inhibition of irrelevant information, and shifting between tasks and mental sets. Studies I-III investigated the effects of computer-based training of WM updating (Study I), inhibition (Study II), and set shifting (Study III) in healthy young adults. All studies showed increased performance on the trained task. More importantly, improvement on an untrained task tapping the trained executive function (near transfer) was seen in Studies I and II. None of the three studies showed improvement on untrained tasks tapping some other cognitive function (far transfer) as a result of training. Study I also used PET to investigate the effects of WM updating training on a neurotransmitter closely linked to WM, namely dopamine. The PET results revealed increased striatal dopamine release during WM updating performance as a result of training. Study IV investigated the ability to inhibit task-irrelevant stimuli in bilinguals and monolinguals by using a dichotic listening task. The results showed that the bilinguals exceeded the monolinguals in inhibiting task-irrelevant information. Study V introduced a new, complementary research approach to study the bilingual executive advantage and its underlying mechanisms. To circumvent the methodological problems related to natural groups designs, this approach focuses only on bilinguals and examines whether individual differences in bilingual behavior correlate with executive task performance. Using measures that tap the three above-mentioned executive functions, the results suggested that more frequent language switching was associated with better set shifting skills, and earlier acquisition of the second language was related to better inhibition skills. In conclusion, the present behavioral results showed that computer-based training of executive functions can improve performance on the trained task and on closely related tasks, but does not yield a more general improvement of cognitive skills. Moreover, the functional neuroimaging results reveal that WM training modulates striatal dopaminergic function, pointing to training-induced neural plasticity in this important neurotransmitter system. With regard to bilingualism, the results provide further support for the idea that bilingualism can enhance executive functions. In addition, the new complementary research approach proposed here provides some clues as to which aspects of everyday bilingual behavior may be related to the advantage in executive functions in bilingual individuals.
Abstract:
The ongoing global financial crisis has demonstrated the importance of a systemwide, or macroprudential, approach to safeguarding financial stability. An essential part of macroprudential oversight concerns the tasks of early identification and assessment of risks and vulnerabilities that may eventually lead to a systemic financial crisis. Effective tools are crucial, as they allow early policy actions to decrease or prevent further build-up of risks or to otherwise enhance the shock absorption capacity of the financial system. In the literature, three types of systemic risk can be identified: i) build-up of widespread imbalances, ii) exogenous aggregate shocks, and iii) contagion. Accordingly, the systemic risks are matched by three categories of analytical methods for decision support: i) early-warning, ii) macro stress-testing, and iii) contagion models. Stimulated by the prolonged global financial crisis, today's toolbox of analytical methods includes a wide range of innovative solutions to the two tasks of risk identification and risk assessment. Yet the literature lacks a focus on the task of risk communication. This thesis discusses macroprudential oversight from the viewpoint of all three tasks: within analytical tools for risk identification and risk assessment, the focus is on tight integration of means for risk communication. Data and dimension reduction methods, and their combinations, hold promise for representing multivariate data structures in easily understandable formats. The overall task of this thesis is to represent high-dimensional data concerning financial entities on low-dimensional displays. The low-dimensional representations have two subtasks: i) to function as a display for individual data concerning entities and their time series, and ii) to serve as a basis to which additional information can be linked. The final nuance of the task is, however, set by the needs of the domain, data and methods. The following five questions comprise the subsequent steps addressed in this thesis: 1. What are the needs for macroprudential oversight? 2. What form do macroprudential data take? 3. Which data and dimension reduction methods hold most promise for the task? 4. How should the methods be extended and enhanced for the task? 5. How should the methods and their extensions be applied to the task? Based upon the Self-Organizing Map (SOM), this thesis not only creates the Self-Organizing Financial Stability Map (SOFSM), but also lays out a general framework for mapping the state of financial stability. The thesis also introduces three extensions to the standard SOM for enhancing the visualization and extraction of information: i) fuzzifications, ii) transition probabilities, and iii) network analysis. Thus, the SOFSM functions as a display for risk identification, on top of which risk assessments can be illustrated. In addition, this thesis puts forward the Self-Organizing Time Map (SOTM) to provide means for visual dynamic clustering, which in the context of macroprudential oversight concerns the identification of cross-sectional changes in risks and vulnerabilities over time. Rather than automated analysis, the aim of these visual means for identifying and assessing risks is to support disciplined and structured judgmental analysis based upon policymakers' experience and domain intelligence, as well as external risk communication.
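As an illustration of the underlying method, the following is a minimal sketch of standard SOM training in NumPy. It is not the SOFSM itself: the thesis extends the standard SOM with fuzzifications, transition probabilities and network analysis, and trains on real macro-financial indicators rather than the placeholder data used here:

```python
# Minimal Self-Organizing Map training sketch in NumPy (illustrative only).
import numpy as np

def train_som(data, grid_h=6, grid_w=8, epochs=20, lr0=0.5, radius0=3.0):
    rng = np.random.default_rng(0)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of each unit, used by the neighborhood function.
    coords = np.dstack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                                   indexing="ij")).astype(float)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in data:
            # Best-matching unit: the prototype closest to the observation.
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighborhood around the BMU on the 2D grid.
            grid_d2 = np.sum((coords - np.asarray(bmu, float)) ** 2, axis=2)
            h = np.exp(-grid_d2 / (2 * radius ** 2))[..., None]
            weights += lr * h * (x - weights)        # pull units toward x
    return weights

data = np.random.default_rng(1).random((200, 4))  # placeholder indicators
som = train_som(data)
print(som.shape)  # (6, 8, 4): a low-dimensional display of 4D data
```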
Abstract:
The Japanese quail Coturnix japonica, which originated in North Africa, Europe and Asia, is used worldwide as an experimental animal and a model for aviculture. The current paper characterizes Eimeria bateri, Eimeria tsunodai and Eimeria uzura recovered from C. japonica. Given that quails have a global distribution, as do their coccidia, the findings of this study should provide the means for diagnosis of these Eimeria spp. in other regions and continents. Eimeria bateri showed the greatest intensity of infection and shed oocysts from the fourth day after infection; in contrast, E. tsunodai and E. uzura shed oocysts from the fifth day after infection. The three species shared a high degree of similarity and were all polymorphic. Yet the application of linear regressions, histograms and ANOVA provided the means for the identification of these species. Finally, the algorithm was very efficient, since it verified that the resulting values did not overlap.
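As a hedged illustration of the kind of morphometric discrimination described, a one-way ANOVA can test whether oocyst dimensions differ between species. The measurements below are synthetic placeholders, not data from the study:

```python
# One-way ANOVA on oocyst length by species; numbers are synthetic.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
e_bateri = rng.normal(22.0, 1.2, 50)    # hypothetical oocyst lengths (um)
e_tsunodai = rng.normal(19.5, 1.1, 50)
e_uzura = rng.normal(21.0, 1.3, 50)

f_stat, p_value = f_oneway(e_bateri, e_tsunodai, e_uzura)
print(f"F = {f_stat:.2f}, p = {p_value:.3g}")
# A small p-value indicates the species means differ, supporting
# morphometric separation of otherwise similar, polymorphic oocysts.
```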
Abstract:
This work deals with a hybrid PID + fuzzy logic controller applied to the control of machine tool biaxial table motions. The non-linear model includes backlash and axis elasticity. Two PID controllers perform the primary table control. A third, PID + fuzzy controller has a cross-coupled structure whose function is to minimise the trajectory contour errors. Once the three controllers are tuned, the system is simulated with and without the third controller. The response results are plotted and compared to analyse the effectiveness of this hybrid controller on the system. They show that the proposed methodology reduces the contour error by a ratio of 70:1.
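The following is a minimal sketch of the control structure described: one PID controller per axis plus a cross-coupled corrector acting on the contour error. The gains, the simple integrator plant and the straight-line reference are illustrative assumptions, not the tuned values or the non-linear model from the paper:

```python
# Per-axis PID control plus a cross-coupled contour-error corrector.
import math

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

dt = 1e-3
pid_x, pid_y = PID(8.0, 2.0, 0.01, dt), PID(8.0, 2.0, 0.01, dt)
ccc = PID(4.0, 0.0, 0.005, dt)       # cross-coupled contour controller
x = y = 0.0                          # each axis modeled as a pure integrator
x_ref, y_ref = 1.0, 0.5              # straight-line path through the origin
tx, ty = (v / math.hypot(x_ref, y_ref) for v in (x_ref, y_ref))  # unit tangent

for _ in range(5000):
    ex, ey = x_ref - x, y_ref - y
    contour = ey * tx - ex * ty      # deviation perpendicular to the path
    u_c = ccc.update(contour)
    # Axis commands: per-axis PID plus the distributed contour correction.
    ux = pid_x.update(ex) - u_c * ty
    uy = pid_y.update(ey) + u_c * tx
    x += ux * dt
    y += uy * dt

print(f"final position: ({x:.3f}, {y:.3f})")  # approaches (1.0, 0.5)
```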
Abstract:
The evolution of digital circuit technology, leading to higher speeds and more reliability, allowed the development of machine controllers adapted to new production systems (e.g., Flexible Manufacturing Systems - FMS). Most of these controllers are developed in agreement with the CNC technology of the corresponding machine tool manufacturer. Alterations or adaptations of their components are not easy to implement. Machine designers face hardware and software restrictions such as lack of interaction among the system's elements and the impossibility of adding new functions, owing to hardware incompatibility and to software that does not allow alterations to the source program. The introduction of the open architecture philosophy fostered the evolution of a new generation of numeric controllers, bringing conventional CNC technology to the standard IBM-PC microcomputer. As a consequence, the characteristics of the CNC (positioning) and of the microcomputer (ease of programming, system configuration, network communication, etc.) are combined. Some researchers have addressed a flexible structure of software and hardware allowing changes in the basic hardware configuration and at all control software levels. In this work, the development of open architecture controllers based on the OSACA, OMAC, HOAM-CNC and OSEC architectures is described.
Abstract:
This work presents the implementation and comparison of three different techniques of three-dimensional computer vision:
• Stereo vision - correlation between two 2D images;
• Sensorial fusion - use of different sensors: a 2D camera plus a 1D ultrasound sensor;
• Structured light.
The computer vision techniques presented here were evaluated with respect to the following characteristics:
• Computational effort (elapsed time to obtain the 3D information);
• Influence of environmental conditions (noise due to non-uniform lighting, overlighting and shadows);
• The cost of the infrastructure for each technique;
• Analysis of uncertainties, precision and accuracy.
Matlab, version 5.1, was chosen for the algorithm implementation of the three techniques due to the simplicity of its commands, programming and debugging. Besides, this software is well known and widely used by the academic community, allowing the results of this work to be reproduced and verified. Examples of three-dimensional vision applied to robotic assembly ("pick-and-place") tasks are presented.
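As a sketch of the first technique (stereo vision by correlation between two 2D images), the following computes per-pixel disparity by block matching with a sum-of-absolute-differences cost. It is written in Python/NumPy rather than the Matlab 5.1 used in the work, and the images are synthetic:

```python
# Stereo block matching along scanlines using SAD correlation.
import numpy as np

def disparity_sad(left, right, block=5, max_disp=16):
    """Per-pixel disparity by correlating left blocks against right blocks."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y-half:y+half+1, x-half:x+half+1]
            costs = [
                np.abs(patch - right[y-half:y+half+1,
                                     x-d-half:x-d+half+1]).sum()
                for d in range(max_disp)
            ]
            disp[y, x] = int(np.argmin(costs))  # best-matching shift
    return disp  # depth is proportional to baseline * focal_length / disparity

rng = np.random.default_rng(2)
left = rng.random((40, 60)).astype(np.float32)
right = np.roll(left, -3, axis=1)            # synthetic 3-pixel disparity
print(disparity_sad(left, right)[20, 30])    # expect ~3 for most pixels
```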
Abstract:
Magaly Bascones's presentation at the Library Network Days (Kirjastoverkkopäivät), 24 October 2013, Helsinki.
Abstract:
Drug discovery is a continuous process in which researchers are constantly trying to find new and better drugs for the treatment of various conditions. Alzheimer's disease, a neurodegenerative disease mostly affecting the elderly, has a complex etiology with several possible drug targets. Some of these targets have been known for years, while other targets and theories have emerged more recently. Cholinesterase inhibitors are the major class of drugs currently used for the symptomatic treatment of Alzheimer's disease. In the Alzheimer's disease brain there is a deficit of acetylcholine and an impairment of signal transmission. Acetylcholinesterase has therefore been the main target, as it is the main enzyme hydrolysing acetylcholine and ending neurotransmission. It is believed that by inhibiting acetylcholinesterase, cholinergic signalling can be enhanced and the cognitive symptoms that arise in Alzheimer's disease can be improved. Butyrylcholinesterase, the second enzyme of the cholinesterase family, has more recently attracted interest among researchers. Its function is still not fully known, but it is believed to play a role in several diseases, one of them being Alzheimer's disease. In this contribution the aim has primarily been to identify butyrylcholinesterase inhibitors to be used as drug molecules or molecular probes in the future. Both synthetic and natural compounds in diverse and targeted screening libraries have been used for this purpose. The active compounds have been further characterized regarding their potencies and cytotoxicity and, in two of the publications, their ability to also inhibit Aβ aggregation, in an attempt to discover bifunctional compounds. Further, in silico methods were used to evaluate the binding position of the active compounds within the enzyme targets, mostly to differentiate selectivity towards acetylcholinesterase versus butyrylcholinesterase, but also to assess the structural features required for enzyme inhibition. We also evaluated the compounds, active and non-active, in chemical space using the web-based tool ChemGPS-NP to determine the relevant chemical space occupied by cholinesterase inhibitors. In this study, we have succeeded in finding potent butyrylcholinesterase inhibitors with a diverse set of structures, nine chemical classes in total. In addition, some of the compounds are bifunctional, as they also inhibit Aβ aggregation. We believe the data gathered from all publications regarding the chemical space occupied by butyrylcholinesterase inhibitors will give insight into the chemically active space occupied by this type of inhibitor and will hopefully facilitate future screening and lead to an even deeper knowledge of butyrylcholinesterase inhibitors.
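As an illustration of how inhibitor potency is typically characterized in such screening work, the following fits a four-parameter logistic (Hill) curve to dose-response data to estimate an IC50. The data points are synthetic placeholders, not measurements from the publications:

```python
# Fit a four-parameter logistic (Hill) curve to dose-response data.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ic50, slope):
    # Fraction of enzyme activity remaining at a given inhibitor concentration.
    return bottom + (top - bottom) / (1 + (conc / ic50) ** slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])   # uM, hypothetical
activity = np.array([98, 95, 88, 70, 48, 25, 10, 4])    # % enzyme activity

params, _ = curve_fit(hill, conc, activity, p0=[0, 100, 1, 1])
bottom, top, ic50, slope = params
print(f"estimated IC50 = {ic50:.2f} uM (Hill slope {slope:.2f})")
```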
Abstract:
Enabling Change in Universities: Enhancing Education for Sustainable Development with Tools for Quality Assurance. This thesis deals with enabling change in universities, more specifically with enhancing education for sustainable development (ESD) with tools for quality assurance. Change management is a discipline within management that was developed in the 1980s because business changed from being predictable to unpredictable. The PEST mnemonic is a method to categorize factors enabling change, such as political, economic, socio-cultural and technological factors, all of which affect higher education. Classifying a change as either hard or soft can help in understanding the type of change an organization is facing. Hard changes apply to problems that have clear objectives and indicators and a known cause; soft changes apply to larger problems that affect the entire organization or beyond. The basic definition of sustainable development is that future generations should have opportunities similar to those of previous generations. The UN has set as a global goal the integration of education for sustainable development at all levels of education during 2005-2014. The goal applies also to universities, whose graduates are future leaders in all labor markets. The objective of ESD in higher education is that graduates obtain the competence to take economic, social and environmental costs and benefits into account when making decisions. Knowledge outcomes should aim for systematic and holistic thinking, which requires cross-disciplinary education. So far, the development of ESD has not achieved its goals, and the UN has identified a need for more transdisciplinary research in ESD. A joint global requirement for universities is quality assurance, the aim of which is to secure and improve teaching and learning. Quality, environmental and integrated management systems are used by some universities to fulfill the quality assurance requirements. The goal of this thesis is to open up new ways for enhancing ESD in universities, beyond the forerunners, by exploring how management systems could be used as tools for promoting ESD. The thesis is based on five studies. In the first study, I focus on whether and how tools for quality assurance could be used for promoting ESD. It is written from a new perspective, the memetic, in order to reach a diversity of faculty. A meme is an idea that diffuses from brain to brain; the theory can be applied to cultural evolution and is based on Darwin's theory of evolution applied to the social sciences. In the second paper, I present the results from the development of the pilot process model for enhancing ESD with management systems. The development of the model is based on a study that includes earlier studies, a survey in academia and an analysis of practice in 11 universities in the Nordic countries. In the third study, I explore whether the change depends on national culture or is global; it is a comparative study at both the policy and implementation levels between the Nordic countries and China. The fourth study is a single case study based on change management, in which I identify what to consider in order to enable the change: enhancing ESD with tools for quality assurance in universities. In the fifth paper, I present the results of the process model for enhancing ESD with management systems. The model was compared with identified drivers and barriers for enhancing ESD and for implementing management systems.
Finally, the process model was piloted and applied to identifying sustainability aspects in curricula. Action research was chosen as the methodology because there were no already implemented approaches using quality management for promoting ESD, so the only way to study this was to make it happen. Another reason for choosing action research is that involving students and faculty is essential for enhancing ESD. Action-based research consists of the following phases: a) diagnosing, b) planning action, c) taking action and d) evaluating action. This research was made possible by a project called Education for Sustainable Development in Academia in the Nordic countries (ESDAN), in which activities were divided into these four phases. Each phase ended with an open seminar where the results of the study were presented. The objective of the research project was to develop a process for including knowledge in sustainable development in curricula, which could be used in quality assurance work. Eleven universities from the Nordic countries cooperated in the project. The aim was, by applying the process, to identify and publish examples of relevant sustainability aspects in different degree programs in universities in the Nordic countries. The project was partly financed by the Nordic Council of Ministers and partly by the participating pilot universities. Based on the results of my studies, I consider that quality, environmental and integrated management systems can be used for promoting ESD in universities. Relevant sustainability aspects have been identified in different fields of study by applying the final process model. The final process model was compared with the drivers and barriers for enhancing ESD and for implementing management systems in universities, and with factors for succeeding with management systems in industry; it corresponds with these, meaning that drivers are taken into account and barriers tackled. Both ESD and management systems in universities could be considered successful memes, which can reflect an effective way of communication among individuals. I have identified that management systems could be used as tools for hard changes and to support the soft change of enhancing ESD in universities. Based on the change management study, I have summarized recommendations on what to consider in order to enable the studied change. The main practical implication of the results is that the process model, when applied, could be used for assessment, benchmarking and communication of ESD connected to quality assurance. This is possible because the information can be assembled in one picture, which facilitates comparison. The memetic approach can be applied for structuring. Comparative studies between cultures are viable for gaining insight into the special characteristics of one's own culture. Action-based research is suitable for involving faculty. Change management can be applied to planning a change, and both enhancing ESD and developing management systems were identified as such changes.
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and other elements of the software development environment by tracing unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; this is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the post-conditions constrain the service developer to implement the right functionality. The details of the methods can be manually inserted by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
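As a hedged sketch of the kind of skeleton such a partial code generator might emit, consider a stateful booking resource whose generated guards enforce the pre- and post-conditions around a developer-filled body. The class, states and conditions here are hypothetical, not the tool's actual output:

```python
# Hypothetical generated skeleton: preconditions guard the stateful REST
# method, postconditions check the developer-implemented body.
class BookingResource:
    def __init__(self):
        self.state = "AVAILABLE"   # service state machine: AVAILABLE -> BOOKED

    def put_booking(self, dates):
        # Precondition: a booking may only be created while the room is free.
        assert self.state == "AVAILABLE", "precondition violated: not available"
        # --- developer-implemented body (generated as a stub) ---
        self.state = "BOOKED"
        self.dates = dates
        # --- end developer body ---
        # Postcondition: the resource must now be in the BOOKED state.
        assert self.state == "BOOKED", "postcondition violated"
        return {"status": 201, "state": self.state}

room = BookingResource()
print(room.put_booking({"from": "2024-07-01", "to": "2024-07-05"}))
```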