974 results for Open-Design


Relevance: 30.00%

Abstract:

Since the first implantation of an endograft in 1991, endovascular aneurysm repair (EVAR) has rapidly gained recognition. Historical trials showed lower early mortality rates, but these results were not maintained beyond 4 years. Despite newer-generation devices, EVAR is associated with higher rates of reintervention during follow-up. The best therapeutic decision therefore relies on many parameters that the physician has to take into consideration. Patients' preferences and characteristics are important, especially age and life expectancy in addition to health status. Aneurysmal anatomy probably remains the most predictive factor and should be carefully evaluated to offer the best treatment. Unfavourable anatomy has been associated with more complications, especially endoleak, leading to more reinterventions and a higher risk of late mortality. Nevertheless, technological advances have led surgeons to move beyond the established barriers, and more endografts are implanted outside the instructions for use despite excellent results after open repair, especially in low-risk patients. When debating AAA repair, some other crucial points should be analysed. It has been shown that strict surveillance is mandatory after EVAR to obtain durable results and prevent late rupture. Such a program is associated with additional costs and with an increased risk of radiation exposure. Moreover, there is a risk of loss of renal function when repetitive imaging and secondary procedures are required. The aim of this article is to review the data on abdominal aortic aneurysm and its treatment in order to establish selection criteria for deciding between open and endovascular repair.

Relevance: 30.00%

Abstract:

Background: In ∼5% of advanced NSCLC tumours, ALK tyrosine kinase is constitutively activated after translocation of ALK. ALK+ NSCLC was shown to be highly sensitive to the first approved ALK inhibitor, crizotinib. However, all pts eventually relapse on crizotinib mainly due to secondary ALK mutations/amplification or CNS metastases. Alectinib is a highly selective, potent, oral next-generation ALK inhibitor. Clinical phase II alectinib data in 46 crizotinib-naïve pts with ALK+ NSCLC reported an objective response rate (ORR) of 93.5% and a 1-year progression-free rate of 83% (95% CI: 68-92) (Inoue et al. J Thorac Oncol 2013). CNS activity was seen: of 14 pts with baseline brain metastasis, 11 had prior CNS radiation, 9 of these experienced CNS and systemic PFS of >12 months; of the 3 pts without prior CNS radiation, 2 were >15 months progression free. Trial design: Randomised, multicentre, phase III, open-label study in pts with treatment-naïve ALK+ advanced, recurrent, or metastatic NSCLC. All pts must provide pretreatment tumour tissue to confirm ALK rearrangement (by IHC). Pts (∼286 from ∼180 centres, ∼30 countries worldwide) will be randomised to alectinib (600mg oral bid, with food) or crizotinib (250mg oral bid, with/without food) until disease progression (PD), unacceptable toxicity, withdrawal of consent, or death. Stratification factors are: ECOG PS (0/1 vs 2), race (Asian vs non-Asian), baseline CNS metastases (yes vs no). Primary endpoint: PFS by investigators (RECIST v1.1). Secondary endpoints: PFS by Independent Review Committee (IRC); ORR; duration of response; OS; safety; pharmacokinetics; quality of life. Additionally, time to CNS progression will be evaluated (MRI) for the first time in a prospective randomised NSCLC trial as a secondary endpoint. Pts with isolated asymptomatic CNS progression will be allowed to continue treatment beyond documented progression until systemic PD and/or symptomatic CNS progression, according to investigator opinion. 
Time to CNS progression will be retrospectively assessed by the IRC using two separate criteria, RECIST and RANO. Further details: ClinicalTrials.gov (NCT02075840). Disclosure: T.S.K. Mok: Advisory boards: AZ, Roche, Eli Lilly, Merck Serono, Eisai, BMS, AVEO, Pfizer, Taiho, Boehringer Ingelheim, Novartis, GSK Biologicals, Clovis Oncology, Amgen, Janssen, BioMarin; board of directors: IASLC; corporate sponsored research: AZ; M. Perol: Advisory boards: Roche; S.I. Ou: Consulting: Pfizer, Chugai, Genentech Speaker Bureau: Pfizer, Genentech, Boehringer Ingelheim; I. Bara: Employee: F. Hoffmann-La Roche Ltd; V. Henschel: Employee and stock: F. Hoffmann-La Roche Ltd.; D.R. Camidge: Honoraria: Roche/Genentech. All other authors have declared no conflicts of interest.

Relevance: 30.00%

Abstract:

Identification of chemical compounds with specific biological activities is an important step in both chemical biology and drug discovery. When the structure of the intended target is available, one approach is to use molecular docking programs to assess the chemical complementarity of small molecules with the target; such calculations provide a qualitative measure of affinity that can be used in virtual screening (VS) to rank-order a list of compounds according to their potential to be active. rDock is a molecular docking program developed at Vernalis for high-throughput VS (HTVS) applications. Evolved from RiboDock, the program can be used against proteins and nucleic acids, is designed to be computationally very efficient, and allows the user to incorporate additional constraints and information as a bias to guide docking. This article provides an overview of the program's structure and features and compares rDock with two reference programs, AutoDock Vina (open source) and Schrodinger's Glide (commercial). In terms of computational speed for VS, rDock is faster than Vina and comparable to Glide. For binding-mode prediction, rDock and Vina are superior to Glide. The VS performance of rDock is significantly better than that of Vina but inferior to Glide for most systems unless pharmacophore constraints are used, in which case rDock and Glide perform equally well. The program is released under the Lesser General Public License and is freely available for download, together with the manuals, example files and the complete test sets, at http://rdock.sourceforge.net/
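The rank-ordering step described above is the core of any VS post-processing, regardless of the docking engine. As a minimal sketch (not rDock's own tooling; compound names and scores below are illustrative), one can sort a scored library and keep the top-scoring fraction for follow-up:

```python
# Minimal virtual-screening rank-ordering sketch: docking scores are
# treated as "lower = better predicted affinity" (as in most docking
# programs). Compound IDs and score values here are invented examples,
# not rDock output.

def rank_compounds(scores, top_fraction=0.5):
    """Return compound IDs sorted by score (ascending), truncated to
    the requested top fraction of the library."""
    ranked = sorted(scores, key=lambda item: item[1])
    keep = max(1, int(len(ranked) * top_fraction))
    return [name for name, _ in ranked[:keep]]

library = [("ligA", -22.4), ("ligB", -15.1), ("ligC", -30.7), ("ligD", -9.8)]
hits = rank_compounds(library)
print(hits)  # best-scoring half of the library
```

In a real HTVS campaign the same idea applies at scale: scores are parsed from the docking program's output files and only the top-ranked compounds are advanced to experimental testing.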

Relevance: 30.00%

Abstract:

Network virtualisation is considerably gaining attention as a solution to ossification of the Internet. However, the success of network virtualisation will depend in part on how efficiently the virtual networks utilise substrate network resources. In this paper, we propose a machine learning-based approach to virtual network resource management. We propose to model the substrate network as a decentralised system and introduce a learning algorithm in each substrate node and substrate link, providing self-organization capabilities. We propose a multiagent learning algorithm that carries out the substrate network resource management in a coordinated and decentralised way. The task of these agents is to use evaluative feedback to learn an optimal policy so as to dynamically allocate network resources to virtual nodes and links. The agents ensure that while the virtual networks have the resources they need at any given time, only the required resources are reserved for this purpose. Simulations show that our dynamic approach significantly improves the virtual network acceptance ratio and the maximum number of accepted virtual network requests at any time while ensuring that virtual network quality of service requirements such as packet drop rate and virtual link delay are not affected.
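The paper's multiagent algorithm is not reproduced here; as a hedged illustration of the idea of an agent learning from evaluative feedback, a single substrate-link agent could use tabular Q-learning to decide how much link capacity to reserve for a virtual link. All states, actions, reward values and hyperparameters below are illustrative assumptions, not the authors' design:

```python
import random

# Hedged sketch (not the paper's algorithm): one substrate-link agent
# learns, via tabular Q-learning, what fraction of link capacity to
# reserve. State = a coarse utilisation bucket (0..3), action = a
# reservation level; the reward favours meeting demand with the
# smallest possible reservation, mirroring the goal of reserving only
# the required resources.

ACTIONS = [0.25, 0.5, 0.75, 1.0]   # fraction of capacity to reserve
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # learning rate, discount, exploration

def reward(reserved, demand):
    if reserved < demand:               # QoS violated: packets would drop
        return -1.0
    return 1.0 - (reserved - demand)    # penalise over-reservation

Q = {(s, a): 0.0 for s in range(4) for a in ACTIONS}

def choose(state):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPS:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def step(state, demand, next_state):
    """One learning step: act, observe reward, update Q."""
    a = choose(state)
    r = reward(a, demand)
    best_next = max(Q[(next_state, b)] for b in ACTIONS)
    Q[(state, a)] += ALPHA * (r + GAMMA * best_next - Q[(state, a)])
    return r
```

Running `step` repeatedly against a stable demand drives the greedy policy toward the smallest reservation that still satisfies the demand, which is the behaviour the abstract attributes to its agents.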

Relevance: 30.00%

Abstract:

The expansion of Web 2.0 has opened up a world of possibilities in online learning. However, beyond the mere integration of these tools in education, major changes are required in the educational design of instructional processes. This paper presents an educational experience conducted by the Open University of Catalonia using the social network Facebook, with the purpose of testing a learning model that uses a participation and collaboration methodology among users based on open educational resources.
- The aim of the experience is to test an Open Social Learning (OSL) model, understood as a virtual learning environment open to the Internet community, based on the use of open resources and on a methodology focused on the participation and collaboration of users in the construction of knowledge.
- The topic chosen for this experience on Facebook was "2.0 Journeys: online tools and resources". The objective of this 5-week course was to provide students with resources for managing the various textual, photographic, audiovisual and multimedia materials resulting from a journey.
- The most important changes in the design and development of a course based on OSL concern the role of the teacher, the role of the student, the type of content and the methodology:
- The teacher mixes with the participants, guiding them and offering the benefit of his/her experience and knowledge.
- Students learn through their participation and collaboration with a mixed group of users.
- The content is open and editable under different types of licence that specify the level of accessibility.
- The methodology of the course was based on the creation of a learning community able to self-manage its learning process. For this, a facilitator was needed, and a central activity was established for people to participate and contribute in the community.
- We used an ethnographic methodology, together with questionnaires to students, in order to obtain results regarding the quality of this type of learning experience.
- Some of the data obtained raised questions to consider for future designs of educational situations based on OSL: difficulties in breaking the facilitator-centred structure; the time required to adapt to the system and to achieve the objectives; lack of commitment in free courses; the tendency to return to traditional ways of learning; and accreditation.
- This experience has taught all of us that education can happen at any time and in any place, but not in any way.

Relevance: 30.00%

Abstract:

Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. It has huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next-generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and an extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full-sky coverage and will be operated as an open observatory. The design of CTA is based on currently available technology. This document reports on the status of CTA and presents its major design concepts.


Relevance: 30.00%

Abstract:

Concentrated winding permanent magnet machines and their electromagnetic properties are studied in this doctoral thesis. The thesis covers several main tasks related to the application of permanent magnets in concentrated winding open slot machines. Suitable analytical methods are required for the first design calculations of a new machine. Concentrated winding machines differ from conventional integral slot winding machines in such a way that adapted analytical calculation methods are needed, and a simple analytical model for calculating concentrated winding axial flux machines is provided. Three further design tasks are discussed in more detail. The magnetic length of rotor surface magnet machines is studied, and it is shown that the traditional methods have to be modified in this respect as well. An important topic in this study has been to evaluate and minimize the rotor permanent magnet Joule losses by using segmented magnets, both in calculations and in experiments. The determination of the magnetizing and leakage inductances of a concentrated winding machine and the torque production capability of concentrated winding machines with different pole pair numbers are studied, and the results are compared with the corresponding properties of integral slot winding machines. The thesis introduces a new practical permanent magnet motor type for industrial use. Its special features rest on the option of using concentrated winding open slot constructions of permanent magnet synchronous machines in the normal speed ranges of industrial motors, for instance up to 3000 min-1, without excessive rotor losses. By applying the analytical equations and methods introduced in the thesis, a 37 kW, 2400 min-1, 12-slot 10-pole axial flux machine with rotor-surface-mounted magnets is designed. The performance of the designed motor is determined by experimental measurements and finite element calculations.
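The thesis's own analytical model is not reproduced here. As a generic cross-check only, the electromagnetic torque of a surface-magnet machine can be estimated from the standard dq model (with surface magnets, Ld ≈ Lq, so reluctance torque vanishes and T = (3/2)·p·ψ_PM·i_q). The 37 kW, 2400 min⁻¹, 10-pole rating comes from the abstract; the flux-linkage and current values are illustrative assumptions:

```python
from math import pi

# Generic surface-magnet PMSM torque estimate from the standard dq
# model (not the thesis's analytical method): T = 1.5 * p * psi_PM * i_q.

def pmsm_torque(pole_pairs, psi_pm, i_q):
    """Electromagnetic torque [Nm] of a surface-magnet PMSM."""
    return 1.5 * pole_pairs * psi_pm * i_q

# Sanity check against the 37 kW, 2400 min^-1 design point from the
# abstract (10 poles -> 5 pole pairs):
speed_rad = 2400 / 60 * 2 * pi          # mechanical speed [rad/s]
torque_required = 37e3 / speed_rad      # rated torque, about 147 Nm

# With assumed (illustrative) psi_PM = 0.2 Wb and i_q = 98 A:
print(pmsm_torque(5, 0.2, 98.1), torque_required)
```

This kind of back-of-the-envelope check is only a starting point; the abstract's point is precisely that concentrated winding machines need adapted analytical methods beyond such textbook formulas.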

Relevance: 30.00%

Abstract:

Breast cancer is the most prevalent neoplasm among women in most countries worldwide. Breast cancer treatment includes mastectomy, which has a strong impact on women. Breast reconstruction is an option for many women to re-establish their body image and to reduce the psychological impact. However, breast reconstruction rates are low, and many factors are involved in not undergoing breast reconstruction. Patient involvement in the decision-making process increases breast reconstruction rates and is associated with higher satisfaction and fewer anxiety and depression symptoms. A closer physician-patient relationship and more education about breast reconstruction are needed to achieve our objective. A new approach to medical care, called the Patson Approach, was created to meet this goal through greater patient involvement as well as physician and psychological counselling. Objective: to increase breast reconstruction rates in women who are candidates for breast reconstruction after mastectomy and are included in the Patson Approach, compared with women included in the Standard Approach. Methods: the study will be a randomized, controlled, open-label clinical trial. 62 patients will be recruited over two years and randomly divided into two groups: 31 in the Standard Approach and 31 in the Patson Approach. Preoperative and postoperative appointments are established for patient follow-up and data collection.

Relevance: 30.00%

Abstract:

We live in an era defined by a wealth of open and readily available information and the accelerated evolution of social, mobile and creative technologies. The provision of knowledge, once a primary role of educators, is now devolved to an immense web of free and readily accessible sources. Consequently, educators need to redefine their role not just "from sage on the stage to guide on the side" but, as more and more voices insist, as "designers for learning". The call for such a repositioning of educators is heard from leaders in the field of technology-enhanced learning (TEL) and resonates well with the growing culture of design-based research in Education. However, it is still struggling to find a foothold in educational practice. We contend that the root causes of this discrepancy are the lack of articulation of design practices and methods, a shortage of tools and representations to support such practices, the lack of a culture of teacher-as-designer among practitioners, and insufficient theoretical development. The Art and Science of Learning Design (ASLD) explores the frameworks, methods, and tools available for teachers, technologists and researchers interested in designing for learning. Learning Design theories arising from research findings are explored, drawing upon research and practitioner experiences. It then surveys current trends in the practices, methods, and methodologies of Learning Design. Highlighting the translation of theory into practice, this book showcases some of the latest tools that support the learning design process itself.

Relevance: 30.00%

Abstract:

The development of software tools began as the first computers were built. The current generation of development environments offers a common interface for accessing multiple software tools and often also provides the possibility to build custom tools as extensions to the existing development environment. Eclipse is an open source development environment that offers a good starting point for developing custom extensions. This thesis presents a software tool to aid the development of context-aware applications on the Multi-User Publishing Environment (MUPE) platform. The tool is implemented as an Eclipse plug-in. It allows developers to include external server-side contexts in their MUPE applications, and additional context sources can be added through Eclipse's extension-point mechanism. The thesis describes how the tool was designed and implemented. The implementation consists of a tool core component and an additional context source extension. The core component is responsible for the actual context addition and also provides the needed user interface elements to the Eclipse workbench. The context source component provides the needed context source related information to the core component. As part of the work, an update site feature was also implemented for distributing the tool through the Eclipse update mechanism.

Relevance: 30.00%

Abstract:

In order to grow, cities are increasingly competing for attention, jobs, investments, visitors, residents and significant events. Cities need to come up with creative solutions to keep up with the competition; they ought to become creative cities. Attracting talented and diverse inhabitants is a key factor in developing a creative city, which is characterized by openness, tolerance, vibrancy and diversity. Along with the need for renewed city images, city brand building has become popular. Helsinki is the World Design Capital 2012 (WDC 2012), and this mega-event presents a meaningful opportunity for the city to broadcast itself globally. The purpose of this study is to evaluate how Helsinki brands itself as a creative city through an international mega-event. The sub-aims are to: 1) map the factors behind the creative city and their relation to the city of Helsinki, 2) describe the city branding process, and 3) evaluate the role of the Helsinki World Design Capital 2012 mega-event in Helsinki's creative city brand building. First, the theory discusses the concept of the creative city, which has gained growing attention during the past decade. Then, the city branding process is described and the benefits of hosting a mega-event are presented. Finally, co-branding a city and a mega-event in order to generate maximum benefit from the mega-event is reviewed. This is a qualitative study for which data was collected through three face-to-face interviews, the World Design Capital 2012 bid, Helsinki's economic development strategy, a consulting firm's research report on the case city, and web pages. The research reveals that Helsinki has shown interest in the creative city discussion, although the terminology around the concept is approached carefully. Helsinki fits many of the creative city characteristics and recognizes its flaws, for which improvement strategies have been planned. The bottlenecks keeping the city from promoting a more open mind were revealed mainly in its organizational structures. Helsinki has no official brand strategy; nonetheless, pressure to develop one is present. The World Design Capital 2012 mega-event is seen as a meaningful stepping stone to strengthen Helsinki's identity and image and to start thinking about a city brand. The brand strategies of the mega-event support the values and virtues of the city itself, which enables the benefits of co-branding introduced in the theory part. Helsinki has no official brand and does not call itself a creative city; however, this study shows signs of the city taking steps towards building a creative city brand with the help of the Helsinki World Design Capital 2012 mega-event.

Relevance: 30.00%

Abstract:

This work presents recent results concerning a design methodology used to estimate the positioning deviation of a gantry (Cartesian) manipulator, related mainly to structural elastic deformation of components under operational conditions. The case-study manipulator is of the gantry type, and its basic dimensions are 1.53 m x 0.97 m x 1.38 m. The dimensions used for the calculation of the effective workspace due to end-effector path displacement are 1 m x 0.5 m x 0.5 m. The manipulator is composed of four basic modules, defined as module X, module Y, module Z and the terminal arm, to which the end-effector is connected. Each module's controlled axis performs a linear-parabolic positioning movement. The path-planning algorithm takes the maximum velocity and the total distance as input parameters for a given task; the acceleration and deceleration times are equal. The Denavit-Hartenberg parameterization method is used in the manipulator kinematics model. The gantry manipulator can be modeled as four rigid bodies with three degrees of freedom in translational movements, connected as an open kinematic chain. Dynamic analyses were performed considering inertial parameter specifications such as component mass, inertia and center-of-gravity position of each module. These parameters are essential for correct dynamic modelling of the manipulator, due to the multiple possibilities of motion and the manipulation of objects with different masses. The dynamic analysis consists of a mathematical modelling of the static and dynamic interactions among the modules. The computation of the structural deformations uses the finite element method (FEM).
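The linear-parabolic movement described above is the classic trapezoidal velocity profile: parabolic position segments during the equal acceleration and deceleration phases, linear in between. A minimal sketch, with illustrative parameter values not taken from the thesis (it assumes the distance is long enough for a constant-velocity cruise phase to exist):

```python
# Position along a linear-parabolic (trapezoidal-velocity) profile.
# Inputs mirror the path-planning parameters named in the abstract:
# maximum velocity, total distance, and the (equal) acceleration and
# deceleration time t_acc. Assumes distance / v_max >= t_acc so a
# cruise phase exists.

def trapezoidal_position(t, v_max, distance, t_acc):
    """Position at time t along a trapezoidal velocity profile."""
    t_total = distance / v_max + t_acc       # total travel time
    if t <= 0:
        return 0.0
    if t < t_acc:                            # parabolic ramp-up
        return 0.5 * (v_max / t_acc) * t * t
    if t < t_total - t_acc:                  # linear (constant velocity)
        return 0.5 * v_max * t_acc + v_max * (t - t_acc)
    if t < t_total:                          # parabolic ramp-down
        dt = t_total - t
        return distance - 0.5 * (v_max / t_acc) * dt * dt
    return distance                          # motion completed

# Illustrative move: 1 m at v_max = 0.5 m/s with 0.4 s ramps.
print(trapezoidal_position(1.2, 0.5, 1.0, 0.4))
```

By symmetry of the equal ramps, the midpoint in time corresponds to the midpoint in distance, which is a quick sanity check for any implementation of this profile.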

Relevance: 30.00%

Abstract:

This study is qualitative action research by nature, with elements of personal design in the form of the construction of a tangible model implementation framework. The empirical data were gathered via two questionnaires in connection with four workshop events with twelve individual participants: five represented maintenance customers, three maintenance service providers and four equipment providers. There are two main research objectives, corresponding to the two complementary focus areas of this thesis. Firstly, the value-based life-cycle model, whose first version was developed prior to this thesis, requires updating in order to increase its real-life applicability as an inter-firm decision-making tool in industrial maintenance. This first research objective is fulfilled by improving the appearance, intelligibility and usability of the above-mentioned model; certain new features are also added. The workshop participants from the collaborating companies were reasonably pleased with the changes made, although further attention will be required in future on the model's intelligibility in particular, as the main results, charts and values were all considered somewhat hard to understand. The upgraded model's appearance and added new features satisfied them the most. Secondly, and more importantly, the premises of the model's possible inter-firm implementation process need to be considered. This second research objective is delivered in two consecutive steps. First, a bipartite open-books-supported implementation framework is created and its characteristics discussed in theory. Afterwards, the prerequisites and pitfalls of increasing inter-organizational information transparency are studied in an empirical context. One of the main findings was that the organizations are not yet prepared for network-wide information disclosure, as dyadic collaboration was favored instead; however, they would be willing to share information bilaterally at least. Another major result was that the current state of the companies' cost accounting systems will need enhancing before implementation, since accurate and sufficiently detailed maintenance data are not available. It will also be crucial to create a supporting and mutually agreed network infrastructure; there are hardly any collaborative models, methods or tools currently in use. Lastly, the essential questions of mutual trust and predominant purchasing strategies are important for cooperation. If inter-organizational activities are expanded, a more relational approach should be favored in this regard. Mutual trust was also recognized as a significant cooperation factor, but it is hard to measure in practice.