988 results for Shoe last design
Abstract:
3 Summary
3.1 English
The pharmaceutical industry has been facing several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone receptor peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
3.2 French
The recent difficulties of the pharmaceutical industry seem to be solvable only by optimizing its drug development process. This optimization increasingly involves so-called "high-throughput" techniques, which are particularly effective when coupled with the computational tools needed to manage the mass of data they produce. In silico approaches such as virtual screening or the rational design of new molecules are now in routine use. Both rely on the ability to predict the details of the molecular interaction between a drug-like molecule (a potential active compound) and a target protein of therapeutic interest. Benchmarks of the software packages addressing this prediction are flattering, but several problems remain. The recent literature tends to question their reliability, pointing to an emerging need for more accurate predictions of the binding mode. This accuracy is essential for computing the binding free energy, which is directly related to the affinity of the candidate compound for the target protein and indirectly related to its biological activity. An accurate prediction is therefore of particular importance for the discovery and optimization of new active molecules. This thesis presents a new software package, EADock, designed with this accuracy in mind. This hybrid evolutionary algorithm uses two selection pressures combined with a sophisticated management of diversity. EADock relies on CHARMM for energy calculations and the handling of atomic coordinates. Its validation was carried out on 37 crystallized protein-ligand complexes, including 11 different proteins. The search space was extended to a sphere of 15 Å radius around the center of mass of the crystallized ligand, and contrary to the usual benchmarks, the algorithm started from optimized solutions with an RMSD of up to 10 Å from the crystal structure. This validation demonstrated the efficiency of our search heuristic, as binding modes with an RMSD below 2 Å from the crystal structure were ranked first for 68% of the complexes. When the five best solutions are taken into account, the success rate climbs to 78%, and to 92% when the whole last generation is considered. Most prediction errors can be attributed to the presence of crystal contacts. Since then, EADock has been used to understand the molecular mechanisms involved in the regulation of the Na,K-ATPase and in the activation of the peroxisome proliferator-activated receptor α (PPARα). It also allowed the interaction of commonly encountered pollutants with PPARγ to be described, as well as the influence of the metabolism of Imatinib (an anticancer drug) on its binding to the Bcr-Abl kinase. An approach based on predicting the interactions of molecular fragments with a target protein is also proposed. It led to the discovery of new peptidic ligands of PPARα and of the α5β1 integrin. In both cases, the activity of these new peptides is comparable to that of well-established ligands, such as Wy14,643 for the former and Cilengitide (an anticancer drug) for the latter.
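As an illustration of the success criterion used in this validation, the minimal sketch below computes the heavy-atom RMSD between a docked pose and the crystallographic ligand and applies the 2 Å cutoff. It assumes both poses are available as NumPy arrays with matching atom order in the same receptor frame (so no superposition is applied); it is not taken from the EADock code itself.

```python
import numpy as np

def pose_rmsd(pred: np.ndarray, ref: np.ndarray) -> float:
    """Heavy-atom RMSD (in Å) between a docked pose and the crystal pose.

    Both arrays are (N, 3) coordinates in the same receptor frame and the
    same atom order, so no superposition is performed, as is usual when
    scoring docking poses against the crystallographic ligand.
    """
    diff = pred - ref
    return float(np.sqrt((diff * diff).sum(axis=1).mean()))

def is_correct(pred: np.ndarray, ref: np.ndarray, cutoff: float = 2.0) -> bool:
    """Success criterion used in the benchmark: RMSD below 2 Å."""
    return pose_rmsd(pred, ref) < cutoff
```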
Abstract:
BACKGROUND: Ischemic stroke is one of the leading causes of mortality worldwide and a major contributor to neurological disability and dementia. Terutroban is a specific TP receptor antagonist with antithrombotic, antivasoconstrictive, and antiatherosclerotic properties, which may be of interest for the secondary prevention of ischemic stroke. This article describes the rationale and design of the Prevention of cerebrovascular and cardiovascular Events of ischemic origin with teRutroban in patients with a history oF ischemic strOke or tRansient ischeMic Attack (PERFORM) Study, which aims to demonstrate the superior efficacy of terutroban versus aspirin in the secondary prevention of cerebrovascular and cardiovascular events. METHODS AND RESULTS: The PERFORM Study is a multicenter, randomized, double-blind, parallel-group study being carried out in 802 centers in 46 countries. The study population includes patients aged ≥55 years who have suffered an ischemic stroke (≤3 months previously) or a transient ischemic attack (≤8 days previously). Participants are randomly allocated to terutroban (30 mg/day) or aspirin (100 mg/day). The primary efficacy endpoint is a composite of ischemic stroke (fatal or nonfatal), myocardial infarction (fatal or nonfatal), or other vascular death (excluding hemorrhagic death of any origin). Safety is being evaluated by assessing hemorrhagic events. Follow-up is expected to last 2-4 years. Assuming a relative risk reduction of 13%, the expected number of primary events is 2,340. To obtain a statistical power of 90%, this requires the inclusion of at least 18,000 patients in this event-driven trial. The first patient was randomized in February 2006. CONCLUSIONS: The PERFORM Study will explore the benefits and safety of terutroban in secondary cardiovascular prevention after a cerebral ischemic event.
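For orientation only, the sketch below applies Schoenfeld's textbook approximation for the number of events required in a two-arm, equally allocated, event-driven trial. With a 13% relative risk reduction, 90% power, and a two-sided alpha of 0.05 it yields roughly 2,200 events, on the same order as the 2,340 planned for PERFORM; the published figure reflects the trial's own design assumptions, which this simplified formula does not reproduce.

```python
import math
from scipy.stats import norm

def required_events(rrr: float, alpha: float = 0.05, power: float = 0.90) -> float:
    """Schoenfeld's approximation of the number of primary events needed in a
    two-arm, 1:1 randomized, event-driven superiority trial, given an assumed
    relative risk reduction `rrr` (hazard ratio = 1 - rrr)."""
    hr = 1.0 - rrr
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 4.0 * z ** 2 / math.log(hr) ** 2

print(round(required_events(0.13)))  # about 2170 under these simplifications
```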
Abstract:
This paper deals with the design of nonregenerative relaying transceivers in cooperative systems where channel state information (CSI) is available at the relay station. The conventional nonregenerative approach is the amplify and forward (A&F) approach, where the signal received at the relay is simply amplified and retransmitted. In this paper, we propose an alternative linear transceiver design for nonregenerative relaying (including pure relaying and the cooperative transmission cases), making proper use of CSI at the relay station. Specifically, we design the optimum linear filtering performed on the data to be forwarded at the relay. As the optimization criterion, we have considered the maximization of mutual information (which provides an information rate for which reliable communication is possible) for a given available transmission power at the relay station. Three different levels of CSI can be considered at the relay station: only first-hop channel information (between the source and relay); first-hop channel and second-hop channel (between relay and destination) information; or a third situation where the relay may have complete cooperative channel information including all the links: the first- and second-hop channels and also the direct channel between source and destination. Despite the latter being a more unrealistic situation, since it requires the destination to inform the relay station about the direct channel, it is useful as an upper benchmark. In this paper, we consider the last two cases relating to CSI. We compare the performance so obtained with the performance of the conventional A&F approach, and also with the performance of regenerative relays and direct noncooperative transmission for two particular cases: narrowband multiple-input multiple-output transceivers and wideband single-input single-output orthogonal frequency division multiplex transmissions.
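As a rough illustration of the optimization criterion (not of the filter designs proposed in the paper), the sketch below evaluates the mutual information of a two-hop source-relay-destination link for a given linear relay filter G, under the usual flat-fading model with white Gaussian noise at both hops. The channel matrices and the example filter are arbitrary placeholders.

```python
import numpy as np

def af_relay_mutual_info(H1, H2, G, snr_src=10.0, noise_var=1.0):
    """Mutual information (bits/channel use) of a two-hop MIMO link
    source -> relay -> destination with a linear relay filter G.

    Model: r = H1 x + n1 at the relay, y = H2 G r + n2 at the destination,
    with white Gaussian noise of variance `noise_var` at both hops and a
    white Gaussian input of total power snr_src * noise_var.
    """
    nt = H1.shape[1]
    Q = (snr_src * noise_var / nt) * np.eye(nt)        # input covariance
    Heff = H2 @ G @ H1                                 # end-to-end channel
    Rn = noise_var * (np.eye(H2.shape[0])
                      + H2 @ G @ G.conj().T @ H2.conj().T)
    C = np.eye(H2.shape[0]) + Heff @ Q @ Heff.conj().T @ np.linalg.inv(Rn)
    return float(np.log2(np.linalg.det(C).real))

# Example: 2x2 channels and an identity relay filter with an arbitrary gain.
rng = np.random.default_rng(0)
H1, H2 = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))
print(af_relay_mutual_info(H1, H2, np.eye(2) * 0.5))
```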
Abstract:
The goal of this dissertation is to find and provide the basis for a managerial tool that allows a firm to easily express its business logic. The methodological basis for this work is design science, where the researcher builds an artifact to solve a specific problem. In this case the aim is to provide an ontology that makes it possible to make a firm's business model explicit. In other words, the proposed artifact helps a firm to formally describe its value proposition, its customers, the relationship with them, the necessary intra- and inter-firm infrastructure, and its profit model. Such an ontology is relevant because until now there has been no model that expresses a company's global business logic from a pure business point of view. Previous models essentially take an organizational or process perspective or cover only parts of a firm's business logic. The four main pillars of the ontology, which are inspired by management science and enterprise and process modeling, are product, customer interface, infrastructure, and finance. The ontology is validated by case studies and a panel of experts and managers. The dissertation also provides a software prototype to capture a company's business model in an information system. The last part of the thesis consists of a demonstration of the value of the ontology in business strategy and Information Systems (IS) alignment. Structure of this thesis: The dissertation is structured in nine parts. Chapter 1 presents the motivations of this research, the research methodology with which the goals shall be achieved, and why this dissertation presents a contribution to research. Chapter 2 investigates the origins, the term, and the concept of business models. It defines what is meant by business models in this dissertation and how they are situated in the context of the firm. In addition, this chapter outlines the possible uses of the business model concept. Chapter 3 gives an overview of the research done in the field of business models and enterprise ontologies. Chapter 4 introduces the major contribution of this dissertation: the business model ontology. In this part of the thesis the elements, attributes, and relationships of the ontology are explained and described in detail. Chapter 5 presents a case study of the Montreux Jazz Festival, whose business model was captured by applying the structure and concepts of the ontology. In effect, it gives an impression of what a business model description based on the ontology looks like. Chapter 6 shows an instantiation of the ontology into a prototype tool: the Business Model Modelling Language BM2L. This is an XML-based description language that makes it possible to capture and describe the business model of a firm and has a large potential for further applications. Chapter 7 is about the evaluation of the business model ontology. The evaluation builds on a literature review, a set of interviews with practitioners, and case studies. Chapter 8 gives an outlook on possible future research and applications of the business model ontology. The main areas of interest are the alignment of business and information technology (IT)/information systems (IS) and business model comparison. Finally, chapter 9 presents some conclusions.
Abstract:
The need for high performance, high precision, and energy savings in rotating machinery demands an alternative to traditional bearings. Because of their contactless operating principle, rotating machines employing active magnetic bearings (AMBs) provide many advantages over traditional ones. Advantages such as contamination-free operation, low maintenance costs, high rotational speeds, low parasitic losses, programmable stiffness and damping, and vibration insulation come at the expense of high cost and a complex technical solution. All these properties make the use of AMBs appropriate primarily for specific and highly demanding applications. High-performance and high-precision control requires model-based control methods and accurate models of the flexible rotor. In turn, complex models lead to high-order controllers and a considerable computational burden. Fortunately, in the last few years advances in signal processing devices have provided a new perspective on the real-time control of AMBs. The design and the real-time digital implementation of high-order LQ controllers, with a focus on fast execution times, are the subjects of this work. In particular, control design and implementation in field-programmable gate array (FPGA) circuits are investigated. The optimal design is guided by the physical constraints of the system when selecting the weighting matrices. The plant model is complemented by augmenting appropriate disturbance models. Compensation of the force-field nonlinearities is proposed to decrease the uncertainty of the actuator. A disturbance-observer-based unbalance compensation for canceling the magnetic force vibrations, or the vibrations in the measured positions, is presented. The theoretical studies are verified by practical experiments on a custom-built laboratory test rig. The test rig uses a prototyping control platform developed in the scope of this work. To sum up, the work takes a step toward an embedded single-chip FPGA-based controller for AMBs.
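To make the LQ design step concrete, the sketch below computes a discrete-time LQ state-feedback gain by solving the discrete algebraic Riccati equation with SciPy. The one-axis rigid-rotor model, sampling time, and weights are hypothetical toy values; they are not the flexible-rotor AMB model or the weighting matrices used in the thesis.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

def dlqr(A, B, Q, R):
    """Discrete-time LQ state-feedback gain K such that u = -K x minimizes
    the cost sum x'Qx + u'Ru for the system x_{k+1} = A x_k + B u_k."""
    P = solve_discrete_are(A, B, Q, R)
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    return K

# Toy one-axis model (position, velocity) sampled at 10 kHz with made-up
# weights -- purely illustrative, not the AMB rotor model of the thesis.
dt = 1e-4
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
K = dlqr(A, B, Q=np.diag([1e4, 1.0]), R=np.array([[1e-2]]))
print(K)
```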
Abstract:
During the last half decade, the popularity of peer-to-peer applications has grown tremendously. Traditionally, only desktop-class computers with fixed-line network connections have been powerful enough to utilize peer-to-peer applications. However, the situation is about to change. The rapid development of wireless terminals will soon enable peer-to-peer applications on these devices as well as on desktops. The possibilities are further enhanced by the upcoming high-bandwidth cellular networks. In this thesis, the applicability and implementation alternatives of an existing peer-to-peer system are studied for two target platforms: a Linux-powered iPaq and a Symbian OS based smartphone. The result is a peer-to-peer middleware component suitable for mobile terminals. It works on both platforms and utilizes Bluetooth networking technology. The implemented software platforms are compatible with each other, and support for additional network technologies can be added with minimal effort.
Abstract:
Wireless local area networks have gained great popularity in recent decades. This thesis deals with the design and development of a user authentication system for a wireless multi-operator network. In a wireless multi-operator network, users can access the services of different operators. Existing authentication methods and systems are discussed first, after which an authentication system for wireless multi-operator networks is described. Two alternative solutions for the authentication system are presented, the so-called multi-session and single-session models. The multi-session model is the conventional approach to user authentication in computer systems: the user has to identify and authenticate himself to each service in the network separately. The single-session model aims at better reliability and usability: the user authenticates only once and can then access multiple services. The final part of the thesis describes the implementation of the designed system. In addition, alternative implementation approaches are proposed, the weaknesses of the system are analysed, and possibilities for further development are discussed.
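The difference between the two models can be sketched as follows: in the single-session (single sign-on) model the user authenticates once to an identity provider and receives a token that any participating operator's service can verify. The toy example below, with a hypothetical shared HMAC key, illustrates the idea only; it is not the system designed in the thesis.

```python
import base64, hashlib, hmac, json, time

SECRET = b"shared-operator-secret"   # hypothetical key shared by the services

def issue_token(user: str, ttl: int = 3600) -> str:
    """Authenticate once: the identity provider issues a signed token."""
    payload = base64.urlsafe_b64encode(
        json.dumps({"user": user, "exp": time.time() + ttl}).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def verify_token(token: str) -> bool:
    """Any participating service verifies the token without re-authenticating."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(base64.urlsafe_b64decode(payload.encode()))
    return claims["exp"] > time.time()

token = issue_token("alice")
print(verify_token(token))   # True: one login, access to many services
```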
Abstract:
The presence of e-portfolios in educational centres, companies and administrations has emerged strongly in recent years, creating very different practices stemming from different objectives and purposes. This situation has led researchers and practitioners to design and implement e-portfolios with little reference to previous knowledge of them; consequently, developments are disparate, and many of the processes and dimensions used both in development and in use are unnecessarily complex. In order to minimize these inconveniences, unify these developmental processes and improve the results of the implementation and use of e-portfolios, it seemed necessary to create a network of researchers, teachers and trainers from different universities and institutions of different kinds who are interested in the investigation and the practice of e-portfolios in Spain. Therefore, the Network on e-portfolio was created in 2006, funded by the Spanish Ministry of Education and led by the Universitat Oberta de Catalunya. Besides the goals associated with the creation of this network, which we wanted to share with other European researchers and experts from other continents, we also present in this paper some data from the first study carried out on the use of e-portfolios in our country, which shows where we are and which trends are the most important for the near future.
Abstract:
Meeting design is one of the most critical prerequisites for the success of facilitated meetings, but how to achieve this success is not yet fully understood. This study presents a descriptive model of the design of technology-supported meetings based on literature findings about the key factors contributing to the success of collaborative meetings, linking these factors to the meeting design steps by exploring how facilitators consider the factors in practice in their design process. The empirical part includes a multiple-case study conducted among 12 facilitators. The case concentrates on the GSS laboratory at LUT, which has been working on facilitation and GSS for the last fifteen years. The study also includes 'control' cases from two comparable institutions. The results of this study highlight both the variances and the commonalities among facilitators in how they design collaboration processes. The design thinking of facilitators at all levels of experience is found to be largely consistent, and therefore the key design factors, as well as their roles across the design process, can be outlined. Session goals, group composition, supporting technology, motivational aspects, physical constraints, and correct design practices were found to be the key factors in design thinking. These factors are further categorized into three types (controllable, constraining, and guiding design factors), because the study findings indicate that the factor type affects a factor's importance in design. Furthermore, the order in which these factors are considered in the design process is outlined.
Abstract:
Optimization is a common procedure after, for example, a process change or rebuild. Optimization aims at finding, for instance, the best way to run the process, or various parts of it, with respect to certain quality properties. The purpose of this work was to optimize, after an investment, four variables (the refining and proportion of one pulp going into the middle ply, wet pressing, and the amount of spray starch) with respect to three quality properties: ply bond strength, geometric bending stiffness, and smoothness. Five mill-scale trial runs were carried out for this work. In the first trial, water or spray starch was added to one of the ply interfaces of the three-ply board; in the second trial, the refining and refiner combinations of the aforementioned middle-ply pulp were changed. The first trial examined the development of ply bond strength, and the second the development of the other strength properties. The third trial investigated the effect of the refining and proportion of the middle-ply pulp, and of a change in the shoe press line load, on ply bond strength, geometric bending stiffness, and smoothness. The fourth trial attempted to repeat the best point of the previous trial and, by slightly changing the parameters, to achieve even better quality properties. This trial, too, examined the effect of the variables on ply bond strength, geometric bending stiffness, and smoothness. The purpose of the last trial was to study the effect of reducing the amount of the same middle-ply pulp on ply bond strength. Owing to various setbacks, the results obtained from the trials remained rather meagre. The trials did show, however, that the strength properties did not improve even when refining was continued. Refining that was unnecessary for the development of the strength properties could therefore be omitted, saving energy and avoiding other problems that extensive refining might cause. With less refining, the specific edge load could also be kept below the level desired at the mill. The missing strength properties must be achieved by other means.
Abstract:
The last decade has shown that the global paper industry needs new processes and products in order to reassert its position. As the paper markets in Western Europe and North America have stabilized, competition has tightened. Along with the development of more cost-effective processes and products, new process design methods are also required to break the old molds and create new ideas. This thesis discusses the development of a process design methodology based on simulation and optimization methods. A bi-level optimization problem and a solution procedure for it are formulated and illustrated. Computational models and simulation are used to illustrate the phenomena inside a real process, and mathematical optimization is exploited to find the best process structures and control principles for the process. Dynamic process models are used inside the bi-level optimization problem, which is assumed to be dynamic and multiobjective due to the nature of papermaking processes. The numerical experiments show that the bi-level optimization approach is useful for different kinds of problems related to process design and optimization. Here, the design methodology is applied to a constrained process area of a papermaking line. However, the same methodology is applicable to all types of industrial processes, e.g., the design of biorefiners, because the methodology is fully general and can easily be modified.
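The nesting of a bi-level formulation can be illustrated with a toy problem: the upper level chooses a design parameter and the lower level finds the best operating point for that design, the upper-level objective being evaluated at the lower-level optimum. The cost functions and parameters below are hypothetical placeholders, not the dynamic, multiobjective papermaking models of the thesis.

```python
import numpy as np
from scipy.optimize import minimize, minimize_scalar

def lower_level(d: float) -> float:
    """Best operating cost achievable for a fixed design parameter d."""
    res = minimize_scalar(lambda u: (u - d) ** 2 + 0.1 * u ** 2,
                          bounds=(0.0, 10.0), method="bounded")
    return res.fun

def upper_level(d: np.ndarray) -> float:
    """Design cost plus the optimal operating cost that the design induces."""
    d = float(d[0])
    return 2.0 * (d - 3.0) ** 2 + lower_level(d)

best = minimize(upper_level, x0=np.array([1.0]), method="Nelder-Mead")
print(best.x, best.fun)   # design parameter and total cost at the optimum
```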
Abstract:
The goal of this Master's thesis is to design a test stand for a centrifugal compressor. Different theoretical aspects of flow parameter measurements, and test rigs built for similar purposes in other research units, are described in the theoretical part of the work. The process of component selection and the description of the chosen components are given in the second part of the thesis. Besides the measuring and control stages, the designed test stand has closed-loop piping, an aftercooler, and a surge tank. An overview and the layout of the test rig are presented in the last chapter of the work.
Abstract:
The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming an ever more competitive and efficient service provider while still being able to provide unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore an emerging phenomenon in the service context is information awareness. Terms like big data and the Internet of things are not only modern buzzwords; they also describe urgent requirements for a new type of competences and solutions. When the amount of information increases and the systems processing information become more efficient and intelligent, it is human understanding and objectives that may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: what kind of information is created, possessed and utilized in the service context, and even more importantly, what information exists but is not acknowledged or used? In this dissertation the focus is on the relationship between service design and service operations. Reframing this relationship refers to viewing the service system from the architectural perspective. The selected perspective allows analysing the relationship between design activities and operational activities as an information system while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations, and the testing of the design artifact components in real service contexts. The relationship between design and operations is analysed in the health care and social care service systems. The existing contributions in service research tend to abstract services and service systems as value-creation, working, or interactive systems. This dissertation adds an important information processing system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication, and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than the existing scientific and managerial models tend to assume. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unknown managerial task. On the architectural level, service-system-specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also provide very practical implications for service providers. The personal, visual and hidden activities of a service, and more importantly all changes that take place in any service system, also have an information dimension. Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to improve activities and to provide a new type of service potential for customers.
Abstract:
Environmental threats are growing and have become global issues. People around the world try to face these issues by two means: remediating environments that are already affected and protecting those that are not yet affected. This thesis describes the design, implementation, and evaluation of an online water quality monitoring system in Lake Saimaa, Finland. The water quality in Lake Saimaa needs to be monitored in order to provide the responsible bodies with valuable information that allows them to act quickly to prevent any negative impact on the lake's environment. The objectives were to design a suitable system, implement the system in Lake Saimaa, and then evaluate the applicability and reliability of such systems for this environment. The requirements for the system were first identified; the design, necessary modifications, and construction of the system then took place. The system was then tested in Lake Saimaa at two locations near the city of Mikkeli. The last step was to evaluate the whole system. The main results were that the application of online water quality monitoring systems in Lake Saimaa can offer many advantages, such as reducing the required manpower, time, and running costs. However, the unreliability of the exact measured values of some parameters remains a drawback of such systems; this could be addressed by using more advanced equipment with more sophisticated features designed specifically for monitoring at the chosen location.
Abstract:
At present, one of the main concerns of green networking is to minimize the power consumption of network infrastructure. Surveys show that the largest share of power is consumed by network devices during runtime. However, to control this power consumption it is important to know which factors have the highest impact on it. This paper focuses on measuring and modeling the power consumption of an Ethernet switch during runtime, considering various input parameters in all possible combinations. For the experiment, three input parameters are chosen: bandwidth, link load, and number of connections. The output to be measured is the power consumption of the Ethernet switch. Because of the uncertain power consumption pattern of the Ethernet switch, a fully comprehensive experimental evaluation would require an infeasible and cumbersome experimental phase. For that reason, the design of experiments (DoE) method has been applied to obtain adequate information on the effect of each input parameter on the power consumption. The work consists of three parts. In the first part, a test bed is planned with the input parameters and the power consumption of the switch is measured. The second part is about generating a mathematical model with the help of design of experiments tools. This model can be used to estimate the power consumption in different scenarios and also to pinpoint the parameters with the greatest influence on power consumption. In the last part, the mathematical model is evaluated by comparison with the experimental values.
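As an illustration of the modeling step, the sketch below fits a first-order model with two-factor interactions to a 2³ full factorial design in the three coded factors (bandwidth, link load, number of connections). The response values are invented for the example and are not the measurements reported in the paper; the relative size of the fitted coefficients indicates which factors would dominate the power consumption.

```python
import numpy as np

# Hypothetical 2^3 full factorial in coded units (-1/+1) for the three
# factors considered in the paper: bandwidth, link load, connections.
X = np.array([[bw, load, conn]
              for bw in (-1, 1) for load in (-1, 1) for conn in (-1, 1)],
             dtype=float)
# Hypothetical measured switch power (W) for each run -- illustrative only.
y = np.array([31.2, 33.0, 31.8, 34.1, 31.5, 33.6, 32.2, 34.9])

# Model matrix: intercept, main effects, and two-factor interactions.
M = np.column_stack([np.ones(len(X)), X,
                     X[:, 0] * X[:, 1], X[:, 0] * X[:, 2], X[:, 1] * X[:, 2]])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
terms = ["1", "bw", "load", "conn", "bw*load", "bw*conn", "load*conn"]
for name, c in zip(terms, coef):
    print(f"{name:>9s}: {c:+.3f}")   # larger |coef| -> stronger influence
```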