Abstract:
Herein, we report the formation of organized mesoporous silica materials prepared from a novel nonionic gemini surfactant, a myristoyl end-capped Jeffamine, synthesized from a polyoxyalkyleneamine (ED900). The behavior of the modified Jeffamine in water was first investigated. A direct micellar phase (L1) and a hexagonal (H1) liquid crystal were found. The structure of the micelles was investigated by SAXS and analysis with the Generalized Indirect Fourier Transformation (GIFT), which show that the particles are globular and of core-shell type. The myristoyl chains, located at the ends of the amphiphile molecule, assemble to form the core of the micelles and, as a consequence, the molecules are folded over on themselves. Mesoporous materials were then synthesized via the self-assembly mechanism. The recovered materials were characterized by SAXS measurements, nitrogen adsorption-desorption analysis, and transmission and scanning electron microscopy. The results clearly evidence that by modifying the synthesis parameters, such as the surfactant/silica precursor molar ratio and the hydrothermal conditions, one can control the size and the nanostructuring of the resulting material. It was observed that the lower the temperature of the hydrothermal treatment, the better the mesopore ordering.
Abstract:
This article is the result of ongoing research into a variety of features of Spanish local government. It aims, in particular, to provide a profile of the tools implemented by local authorities to improve local democracy in Catalonia. The main hypothesis of the work is that, even though the Spanish local model is constrained by a shared and unique set of legal regulations, local institutions in Catalonia have developed their own model of local participation, and the range of such instruments is still increasing. More specifically, the scope of this research is twofold. On the one hand, different types of instruments for public deliberation in the Catalan local administration system are identified and presented, based on their place in the policy cycle. On the other hand, we focus on policy domains and the quality of the decision-making processes. Researching the stability of the participation tools, or whether local democracy prefers more 'ad hoc' processes, allows us to analyze the boundaries and limits of local democracy in Catalonia. The main idea underlying this paper is that, despite the existence of a single legal model regulating municipalities in Catalonia, local authorities tend to use their legally granted self-management capacities to design their own instruments, which end up presenting perceivably distinct features, stressing democracy in different policy domains and at different stages of the policy cycle. Therefore, this paper is intended to identify such models and to provide factors (variables) from which an explanatory model can be built.
Abstract:
This thesis concentrates on studying the operational disturbance behavior of machine tools integrated into FMS. Operational disturbances are short-term failures of machine tools which are especially disruptive to unattended or unmanned operation of FMS. The main objective was to examine the effect of operational disturbances on the reliability and operation time distribution of machine tools. The theoretical part of the thesis covers the fundamentals of FMS relating to the subject of this study. The concept of FMS, its benefits and the operator's role in FMS operation are reviewed. The importance of reliability is presented. The terms describing the operation time of machine tools are formed by adopting standards and references. The concept of failure and the indicators describing reliability and operational performance for machine tools in FMSs are presented. The empirical part of the thesis describes the research methodology, which is a combination of automated (ADC) and manual data collection. This methodology makes it possible to obtain a complete view of the operation time distribution of the studied machine tools. Data collection was carried out in four FMSs consisting of a total of 17 machine tools. Each FMS's basic features and the signals of ADC are described. The indicators describing the reliability and operation time distribution of machine tools were calculated from the collected data. The results showed that operational disturbances have a significant influence on machine tool reliability and operational performance. On average, an operational disturbance occurs every 8.6 hours of operation time and causes a downtime of 0.53 hours. Operational disturbances cause a 9.4% loss in operation time, which is twice the loss caused by technical failures (4.3%). Poor operational disturbance behavior thus decreases the utilization rate.
It was found that the features of the part family to be machined and the related method technology define the operational disturbance behavior of the machine tool. The main causes of operational disturbances were related to material quality variations, tool maintenance, NC program errors, the ATC and the machine tool control. The operator's role was emphasized: it was found that the failure recording activity of the operators correlates with the utilization rate. The more precisely the operators record failures, the higher the utilization rate. FMS organizations that record failures more precisely also have fewer operational disturbances.
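The reliability indicators in the abstract above (a disturbance every 8.6 hours with 0.53 hours of downtime) follow the shape of common MTBF/MTTR bookkeeping. The sketch below is a hedged illustration of how such indicators could be computed from a disturbance log; the function name, the log format and the availability convention MTBF/(MTBF + MTTR) are assumptions for illustration, not the thesis's own definitions.

```python
# Hedged sketch: computing machine-tool reliability indicators from a
# hypothetical disturbance log. The thesis's indicator definitions may
# differ; this uses the common MTBF/(MTBF + MTTR) convention.

def reliability_indicators(operation_hours, disturbances):
    """disturbances: list of downtime durations in hours, one per event."""
    n = len(disturbances)
    total_down = sum(disturbances)
    mtbf = operation_hours / n            # mean operation time between disturbances
    mttr = total_down / n                 # mean downtime per disturbance
    availability = mtbf / (mtbf + mttr)   # fraction of time the tool is productive
    return mtbf, mttr, availability

# Illustrative numbers loosely echoing the reported averages:
# ten disturbances over 86 h of operation, each with 0.53 h of downtime.
mtbf, mttr, avail = reliability_indicators(86.0, [0.53] * 10)
print(mtbf, mttr, round(avail, 3))  # 8.6 0.53 0.942
```

With these toy figures the availability is about 94%, i.e. roughly a 6% downtime loss; the 9.4% loss reported in the thesis is evidently computed with a different operation-time breakdown.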
Abstract:
Workflow management systems aim at the controlled execution of complex application processes in distributed and heterogeneous environments. These systems will shape the structure of information systems in business and non-business environments. E-business and system integration are fertile soil for WF and groupware tools. This thesis aims to study WF and groupware tools in order to gather in-house knowledge of WF, so as to better utilize WF solutions in the future, and to focus on SAP Business Workflow in order to find a global solution for Application Link Enabling (ALE) support for system integration. Piloting this solution at Nokia collects experience with the SAP R/3 WF tool for future development projects. The literature part of this study guides the reader into the world of business process automation, providing a general description of the history, use and potential of WF and groupware software. The empirical part of this study begins with the background of the case study, describing the IT environment that initiated the case: the introduction of SAP R/3 at Nokia, the communication technique in use and the WF tool. The case study focuses on one solution based on SAP Business Workflow. This study provides a concept for monitoring communication between ERP systems and for increasing the quality of system integration. The case study describes a way to create a support model for ALE/EDI interfaces. The support model includes the monitoring organization and the workflow processes for solving the most common IDoc-related errors.
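The monitoring concept above boils down to surfacing the most common error categories among exchanged messages so that support workflows can be routed to them. The sketch below is entirely hypothetical: the status values and record fields are invented for illustration and do not reflect SAP's actual IDoc status model or any ALE API.

```python
# Hypothetical sketch of the monitoring idea behind an ALE/EDI support
# model: classify message-exchange records by status so the most common
# error categories surface first. Statuses and fields are invented
# (they are NOT SAP's IDoc status codes).

from collections import Counter

def summarize_errors(messages):
    """messages: list of dicts with 'status' and 'error_category' keys."""
    failed = [m for m in messages if m["status"] == "error"]
    return Counter(m["error_category"] for m in failed).most_common()

log = [
    {"status": "ok", "error_category": None},
    {"status": "error", "error_category": "mapping"},
    {"status": "error", "error_category": "mapping"},
    {"status": "error", "error_category": "partner-profile"},
]
print(summarize_errors(log))  # [('mapping', 2), ('partner-profile', 1)]
```

A real support model would route each bucket to the organization responsible for that error class; the ranking here only shows where to look first.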
Abstract:
Software testing is one of the essential parts of the software engineering process. The objective of the study was to describe software testing tools and their use, and the thesis contains examples of the use of software testing tools. The study was conducted as a literature study, with focus on current software testing practices and quality assurance standards. A tool classifier was employed, and the testing tools presented in the study were classified according to it. We found that it is difficult to distinguish currently available tools by individual testing activities, as many of them contain functionality that exceeds the scope of a single testing type.
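The classification difficulty noted above can be made concrete: when each tool is tagged with the testing activities it supports, many tools carry more than one tag, so a single-category classifier fits poorly. The tool names and activity labels below are hypothetical, chosen only to illustrate the point.

```python
# Illustrative sketch of the classification problem: many testing tools
# span several testing activities. Tool names and activities are
# hypothetical placeholders.

TOOL_FEATURES = {
    "ToolA": {"unit testing", "coverage analysis"},
    "ToolB": {"load testing"},
    "ToolC": {"unit testing", "static analysis", "coverage analysis"},
}

def tools_spanning_multiple_activities(tools):
    """Return tools that cannot be filed under a single testing type."""
    return sorted(name for name, activities in tools.items() if len(activities) > 1)

print(tools_spanning_multiple_activities(TOOL_FEATURES))  # ['ToolA', 'ToolC']
```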
Abstract:
The use of domain-specific languages (DSLs) has been proposed as an approach to cost-effectively develop families of software systems in a restricted application domain. Domain-specific languages, in combination with the accumulated knowledge and experience of previous implementations, can in turn be used to generate new applications with unique sets of requirements. For this reason, DSLs are considered to be an important approach for software reuse. However, the toolset supporting a particular domain-specific language is also domain-specific and is by definition not reusable. Therefore, creating and maintaining a DSL requires additional resources that could even exceed the savings associated with using it. As a solution, different tool frameworks have been proposed to simplify and reduce the cost of developing DSLs. Developers of tool support for DSLs need to instantiate, customize or configure the framework for a particular DSL. There are different approaches to this. One approach is to use an application programming interface (API) and to extend the basic framework using an imperative programming language. An example of a tool based on this approach is Eclipse GEF. Another approach is to configure the framework using declarative languages that are independent of the underlying framework implementation. We believe this second approach can bring important benefits, as it focuses on specifying what the tool should be like instead of writing a program specifying how the tool achieves this functionality. In this thesis we explore this second approach. We use graph transformation as the basic approach to customize a domain-specific modeling (DSM) tool framework.
The contributions of this thesis include a comparison of different approaches for defining, representing and interchanging software modeling languages and models, and a tool architecture for an open domain-specific modeling framework that efficiently integrates several model transformation components and visual editors. We also present several specific algorithms and tool components for the DSM framework. These include an approach to graph queries based on region operators and the star operator, and an approach for reconciling models and diagrams after executing model transformation programs. We exemplify our approach with two case studies, MICAS and EFCO, in which we show how our experimental modeling tool framework has been used to define tool environments for domain-specific languages.
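The star operator mentioned above is, at its core, the reflexive-transitive closure of an edge relation. The thesis defines region and star operators over typed model graphs; the toy version below works on a plain adjacency dict and is only a sketch of the closure idea, not the thesis's actual query language.

```python
# Minimal sketch of the "star operator" idea in graph queries:
# follow an edge relation zero or more times until a fixpoint
# (reflexive-transitive closure) is reached.

def star(graph, start):
    """Nodes reachable from `start` via zero or more edges."""
    seen, frontier = {start}, [start]
    while frontier:
        node = frontier.pop()
        for succ in graph.get(node, ()):
            if succ not in seen:
                seen.add(succ)
                frontier.append(succ)
    return seen

g = {"a": ["b"], "b": ["c"], "c": [], "d": ["a"]}
print(sorted(star(g, "a")))  # ['a', 'b', 'c']
```

In a model query, such a closure lets a pattern match elements at any containment or reference depth rather than only direct neighbors.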
Abstract:
Prostate-specific antigen (PSA) is a marker that is commonly used in estimating prostate cancer risk. Prostate cancer is usually a slowly progressing disease, which might not cause any symptoms whatsoever. Nevertheless, some cases of cancer are aggressive and need to be treated before they become life-threatening. However, the blood PSA concentration may also rise in benign prostate diseases, and using a single total PSA (tPSA) measurement to guide the decision on further examinations leads to many unnecessary biopsies, over-detection, and overtreatment of indolent cancers which would not require treatment. Therefore, there is a need for markers that would better separate cancer from benign disorders and would also predict cancer aggressiveness. The aim of this study was to evaluate whether intact and nicked forms of free PSA (fPSA-I and fPSA-N) or human kallikrein-related peptidase 2 (hK2) could serve as new tools in estimating prostate cancer risk. First, the immunoassays for fPSA-I and for free and total hK2 were optimized so that they would be less prone to interference caused by factors present in some blood samples. The optimized assays were shown to work well and were used to measure the marker concentrations in the clinical sample panels. The marker levels were measured from preoperative blood samples of prostate cancer patients scheduled for radical prostatectomy, and the association of the markers with cancer stage and grade was studied. It was found that, among all tested markers and their combinations, especially the ratio of fPSA-N to tPSA and the ratio of free PSA (fPSA) to tPSA were associated with both cancer stage and grade. They might be useful in predicting cancer aggressiveness, but further follow-up studies are necessary to fully evaluate the significance of the markers in this clinical setting.
The markers tPSA, fPSA, fPSA-I and hK2 were combined in a statistical model which had previously been shown to reduce unnecessary biopsies when applied to large screening cohorts of men with elevated tPSA. The discriminative accuracy of this model was compared to models based on established clinical predictors, with biopsy outcome as the reference. The kallikrein model and the calculated fPSA-N concentrations (fPSA minus fPSA-I) correlated with prostate volume, and the model predicted prostate cancer at biopsy as well as the clinical models did. Hence, the measurement of kallikreins in a blood sample could replace the volume measurement, which is time-consuming, requires instrumentation and skilled personnel, and is an uncomfortable procedure. Overall, the model could simplify the estimation of prostate cancer risk. Finally, as fPSA-N seems to be an interesting new marker, a direct immunoassay for measuring fPSA-N concentrations was developed. Its analytical performance was acceptable, but the rather complicated assay protocol needs to be improved before it can be used for measuring large sample panels.
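The abstract states two computational ingredients explicitly: fPSA-N is calculated as fPSA minus fPSA-I, and the four markers are combined in a statistical model. Published kallikrein panels of this kind are logistic regression models, so the sketch below uses that form; the coefficients, marker values and units are invented placeholders, not the actual model from the thesis.

```python
# Hedged sketch of combining kallikrein markers in a risk model.
# The logistic form is a common choice for such panels; the
# coefficients below are invented placeholders.

import math

def fpsa_n(fpsa, fpsa_i):
    """Nicked free PSA, calculated as in the text: fPSA minus fPSA-I."""
    return fpsa - fpsa_i

def risk_score(tpsa, fpsa, fpsa_i, hk2, coef=(-3.0, 0.2, -0.5, 0.3, 1.0)):
    """Toy logistic model over the four markers (coefficients invented)."""
    b0, b1, b2, b3, b4 = coef
    z = b0 + b1 * tpsa + b2 * fpsa + b3 * fpsa_i + b4 * hk2
    return 1.0 / (1.0 + math.exp(-z))  # predicted probability in (0, 1)

print(round(fpsa_n(1.2, 0.8), 2))   # 0.4  (illustrative concentrations)
print(0.0 < risk_score(6.0, 1.2, 0.8, 0.1) < 1.0)  # True
```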
Abstract:
In the Innovation Union Scoreboard of 2011, Latvia ranked last among the EU countries in innovation performance. Even though there is a sufficient scientific and technological basis, the results remain modest or low in most of the indicators concerning innovation. Several aspects influence the performance of a national innovation system; in Latvia, the low effectiveness is often attributed to a lack of financial support tools. Finland was chosen as a comparison because of its well-established and well-documented innovation system. The aim of this study is to research the efficiency and effectiveness of the current financial innovation support tool system in Latvia from the point of view of an innovating company. It also attempts to analyze the support tool system of Latvia and compare it to the relevant parts of the Finnish system. The study found that it is problematic for innovative companies in Latvia, especially start-ups and SMEs, to receive the necessary funding due to the low number of grant programs and funds and the limited offering from banks, venture capital and business angels. To improve the situation, the Latvian government should restructure the funding mechanisms, putting a bigger emphasis on innovative start-ups and SMEs. That would lay a foundation for future growth and boost research and scientific activities in Latvia.
Abstract:
The current research addresses various questions raised and deliberated upon by different entrepreneurs. It provides a valuable contribution to understanding the importance of social media and ICT applications, and demonstrates how to support management consulting and business coaching start-ups with the help of social media and ICT tools. The thesis presents a literature review drawing on information systems science, SME and e-business journals, web articles, as well as survey analysis reports on social media applications. The methodology was a qualitative research method in which social anthropological approaches were used to oversee the case study activities in order to collect data; a collaborative social research approach was used to support the action research method. The research discovered that new business start-ups and small businesses do not use social media and ICT tools the way most large corporations do. At present, open-source ICT technologies and social media applications are just as available to new and small businesses as they are to larger companies. Successful implementation of social media and ICT applications can readily enhance start-up performance and overcome business obstacles. The thesis sheds some light on the effective and innovative implementation of social media and ICT applications for new entrepreneurs and small businesses.
Abstract:
The last decade has shown that the global paper industry needs new processes and products in order to reassert its position. As the paper markets in Western Europe and North America have stabilized, competition has tightened. Along with the development of more cost-effective processes and products, new process design methods are also required to break the old molds and create new ideas. This thesis discusses the development of a process design methodology based on simulation and optimization methods. A bi-level optimization problem and a solution procedure for it are formulated and illustrated. Computational models and simulation are used to illustrate the phenomena inside a real process, and mathematical optimization is exploited to find the best process structures and control principles for the process. Dynamic process models are used inside the bi-level optimization problem, which is assumed to be dynamic and multiobjective due to the nature of papermaking processes. The numerical experiments show that the bi-level optimization approach is useful for different kinds of problems related to process design and optimization. Here, the design methodology is applied to a constrained process area of a papermaking line. However, the same methodology is applicable to all types of industrial processes, e.g., the design of biorefineries, because the methodology is fully generalized and can be easily modified.
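The defining feature of a bi-level problem, as described above, is that the outer (design) objective can only be evaluated after the inner (control) problem has been solved for that design. The toy below is a minimal sketch of that nesting under invented scalar objectives; real papermaking models are dynamic and multiobjective, so this shows only the structure, not the thesis's formulation.

```python
# Minimal sketch of a bi-level optimization structure: an outer
# (process design) problem whose objective depends on the optimum of
# an inner (control) problem. Objectives and grids are invented.

def inner_optimum(design):
    """Inner problem: pick the control that best tracks a fixed design."""
    controls = [c / 10.0 for c in range(0, 21)]   # candidate controls in [0, 2]
    return min(controls, key=lambda c: (c - design) ** 2)

def outer_objective(design):
    """Outer problem: evaluate a design together with its optimal control."""
    control = inner_optimum(design)               # solve the inner problem first
    return (design - 1.0) ** 2 + 0.4 * control    # design target vs control cost

designs = [d / 10.0 for d in range(0, 21)]
best = min(designs, key=outer_objective)          # outer grid search
print(best)  # 0.8: below the target 1.0 because the control is costly
```

The grid searches stand in for the simulation-based solvers of the thesis; the point is that every outer evaluation embeds a full inner optimization.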
Abstract:
The starting point of this study is that the prevailing way of considering the Finnish IT industries and industry information often results in a limited and even skewed picture of the sector. The purpose of the study is to contribute to and increase knowledge and understanding of the status, structure and evolution of the Finnish IT industries, as well as of the Finnish IT vendor field and competition. The focus is on the software product and IT services industries, which form a crucial part of all ICT industries. This study examines the Finnish IT sector from the production (supply) as well as the market (demand) perspective, and is based on empirical information from multiple sources. Three research questions were formulated for the study. The first concerns the status of the Finnish IT industries considered by applying theoretical frameworks. The second targets the basis for the future evolution of the Finnish IT industries and, finally, the third the ability of the available definitions and indicators to describe the Finnish IT industries and IT markets. Major structural changes, such as technological changes and related innovations, globalization and new business models, are drivers of the evolution of the IT industries. The findings of this study emphasize the significant role of IT services in the Finnish IT sector and, in connection with that, the ability to combine IT service skills, competences and practices with high-level software skills also in the future. According to the study, Finnish IT enterprises and their customers have become increasingly dependent on global ecosystems and platforms, applications and IT services provided by global vendors. As a result, more IT decisions are made outside Finland. In addition, IT companies are facing new competition from outside the IT industries, which brings new substitutes to the market. To respond to the new competition, IT firms seek growth by expanding beyond their traditional markets.
The changing global division of labor accentuates the need for accurate information on the IT sector but, at the same time, makes it increasingly challenging to acquire the information needed. One of the main contributions of this study is to provide frameworks for describing the Finnish IT sector and its evolution. These frameworks help combine empirical information from various sources and make it easier to concretize the structures, volumes, relationships and interaction of both the production and the market side of the Finnish IT industry. Some frameworks provide tools to analyze the vendor field, competition and the basis for the future evolution of the IT industries. The observations of the study support the argument that static industry definitions and related classifications do not serve the information needs in dynamic industries, such as the IT industries. One of the main messages of this study is the importance of understanding the definitions and starting points of different information sources. At the same time, for describing the structure and evolution of the Finnish IT industries, the number of employees has become a more valid and reliable measure than revenue-based indicators.
Abstract:
Technological developments in microprocessors and the ICT landscape have brought a shift to a new era, where computing power is embedded in numerous small distributed objects and devices in our everyday lives. These small computing devices are fine-tuned to perform a particular task and are increasingly reaching our society at every level. For example, home appliances such as programmable washing machines and microwave ovens employ several sensors to improve performance and convenience. Similarly, cars have on-board computers that use information from many different sensors to control things such as fuel injectors and spark plugs in order to perform their tasks efficiently. These individual devices make life easier by helping in decision-making and removing burdens from their users. All these objects and devices obtain some piece of information about the physical environment, yet each of them is an island with no proper connectivity or information sharing with the others. Sharing information between these heterogeneous devices could enable a whole new universe of innovative and intelligent applications. Such information sharing is a difficult task due to the heterogeneity and lack of interoperability of the devices. The Smart Space vision is to overcome these issues so that the devices can understand each other and utilize each other's services through information sharing. This enables innovative local mashup applications based on data shared between heterogeneous devices. Smart homes are one example of Smart Spaces: through the intelligent interconnection of resources and their collective behavior, they make it possible to bring the health care system to the patient, as opposed to bringing the patient into the health system. In addition, the use of mobile handheld devices has risen at a tremendous rate during the last few years and they have become an essential part of everyday life.
Mobile phones offer a wide range of services to their users, including text and multimedia messages, Internet, audio, video and email applications and, most recently, TV services. Interactive TV provides a variety of applications for viewers. The combination of interactive TV and Smart Spaces could yield innovative applications that are personalized, context-aware, ubiquitous and intelligent, by enabling heterogeneous systems to collaborate with each other by sharing information. There are many challenges in designing the frameworks and application development tools for the rapid and easy development of such applications. The research work presented in this thesis addresses these issues. The original publications presented in the second part of this thesis propose architectures and methodologies for interactive and context-aware applications, and tools for the development of these applications. We demonstrate the suitability of our ontology-driven application development tools and rule-based approach for the development of dynamic, context-aware, ubiquitous iTV applications.
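The Smart Space idea sketched in these two paragraphs is essentially a shared information store: heterogeneous devices publish facts about their environment and query facts published by others, instead of integrating pairwise. The sketch below is a deliberately simplified, hypothetical blackboard; the device names and the triple-like records are invented, and real Smart Space platforms use ontology-based (RDF-style) stores rather than a Python set.

```python
# Hypothetical sketch of a Smart Space shared store: devices publish
# (subject, predicate, value) facts and query by predicate, so they
# never need to know about each other directly.

class SmartSpace:
    def __init__(self):
        self.facts = set()                       # shared fact store

    def publish(self, subject, predicate, value):
        self.facts.add((subject, predicate, value))

    def query(self, predicate):
        """All facts with the given predicate, in a stable order."""
        return sorted(f for f in self.facts if f[1] == predicate)

space = SmartSpace()
space.publish("thermometer-1", "temperature", "21C")   # a sensor publishes
space.publish("tv-livingroom", "status", "on")         # the TV publishes
print(space.query("temperature"))  # [('thermometer-1', 'temperature', '21C')]
```

An iTV application in this model would simply query the predicates it cares about, regardless of which device produced them.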
Abstract:
Few people see both the opportunities and the threats coming from IT legacy in the current world. On the one hand, effective legacy management can bring substantial hard savings and a smooth transition to the desired future state. On the other hand, its mismanagement contributes to serious operational business risks, as old systems are not as reliable as the business users require. This thesis offers one perspective on dealing with IT legacy: through effective contract management as a component of achieving Procurement Excellence in IT, thus bridging IT delivery departments, IT procurement, business units and suppliers. The thesis develops a model for assessing the impact of improvements on the contract management process, together with a set of tools and advice for analysis and improvement actions. A case study was conducted to present and justify the implementation of Lean Six Sigma in an IT legacy contract management environment. Lean Six Sigma proved to be successful, and this thesis presents and discusses all the necessary steps, and the pitfalls to avoid, to achieve breakthrough improvement in IT contract management process performance. For the IT legacy contract management process, two improvements require special attention and can easily be copied to any organization. The first is the issue of diluted contract ownership, which stalls all improvements, as people do not know who is responsible for performing the actions. The second is the contract management performance evaluation tool, which can be used for monitoring and for identifying outlying contracts and opportunities for improvement in the process. The study resulted in valuable insights into the benefits of applying Lean Six Sigma to improve IT legacy contract management, as well as into how Lean Six Sigma can be applied in an IT environment. Managerial implications are discussed.
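One simple way a performance evaluation tool could flag "outlying contracts", in the Lean Six Sigma spirit of data-driven monitoring, is a z-score test on a per-contract performance metric. The sketch below is hypothetical: the metric (lead time), the contract IDs and the threshold are invented for illustration and are not the tool described in the thesis.

```python
# Hedged sketch of flagging outlying contracts with a z-score test on
# a hypothetical performance metric (e.g. handling lead time in days).

import statistics

def outlying_contracts(lead_times, threshold=1.5):
    """Contract ids whose metric is > `threshold` std devs from the mean."""
    values = list(lead_times.values())
    mean = statistics.mean(values)
    sd = statistics.stdev(values)       # sample standard deviation
    return sorted(cid for cid, t in lead_times.items()
                  if abs(t - mean) / sd > threshold)

contracts = {"C-001": 30, "C-002": 32, "C-003": 29, "C-004": 31, "C-005": 90}
print(outlying_contracts(contracts))  # ['C-005']
```

In practice a control chart with process-specific limits would be the more orthodox Six Sigma instrument; the z-score version is just the smallest runnable illustration of the idea.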
It is concluded that the use of the data-driven Lean Six Sigma methodology for improving existing IT contract management processes is a significant addition to the existing best practices in contract management.
Abstract:
Drug discovery is a continuous process in which researchers are constantly trying to find new and better drugs for the treatment of various conditions. Alzheimer's disease, a neurodegenerative disease mostly affecting the elderly, has a complex etiology with several possible drug targets. Some of these targets have been known for years, while other new targets and theories have emerged more recently. Cholinesterase inhibitors are the major class of drugs currently used for the symptomatic treatment of Alzheimer's disease. In the Alzheimer's disease brain there is a deficit of acetylcholine and an impairment in signal transmission. Acetylcholinesterase has therefore been the main target, as it is the main enzyme hydrolysing acetylcholine and ending neurotransmission. It is believed that by inhibiting acetylcholinesterase, cholinergic signalling can be enhanced and the cognitive symptoms that arise in Alzheimer's disease can be improved. Butyrylcholinesterase, the second enzyme of the cholinesterase family, has more recently attracted interest among researchers. Its function is still not fully known, but it is believed to play a role in several diseases, one of them being Alzheimer's disease. In this contribution, the aim has primarily been to identify butyrylcholinesterase inhibitors to be used as drug molecules or molecular probes in the future. Both synthetic and natural compounds in diverse and targeted screening libraries have been used for this purpose. The active compounds have been further characterized regarding their potencies and cytotoxicity and, in two of the publications, regarding the inhibitors' ability to also inhibit Aβ aggregation, in an attempt to discover bifunctional compounds. Further, in silico methods were used to evaluate the binding position of the active compounds with the enzyme targets.
This was done mostly to differentiate between selectivity towards acetylcholinesterase and butyrylcholinesterase, but also to assess the structural features required for enzyme inhibition. We also evaluated the compounds, active and non-active, in chemical space using the web-based tool ChemGPS-NP to try to determine the relevant chemical space occupied by cholinesterase inhibitors. In this study, we have succeeded in finding potent butyrylcholinesterase inhibitors with a diverse set of structures, nine chemical classes in total. In addition, some of the compounds are bifunctional, as they also inhibit Aβ aggregation. We believe that the data gathered from all the publications regarding the chemical space occupied by butyrylcholinesterase inhibitors will give insight into the chemically active space occupied by this type of inhibitor and will hopefully facilitate future screening and result in an even deeper knowledge of butyrylcholinesterase inhibitors.