878 results for computer supported collaborative work


Relevance:

30.00%

Publisher:

Abstract:

In this work, two empirical equations based on TG data are presented and tested (for 106 adducts, mainly of zinc-group halides) to estimate the mean metal-ligand bond dissociation enthalpy of adducts: <D>(M-O) = t_i / g if t_i < 420 K, and <D>(M-O) = (t_i / g) - 7.75 × 10⁻² · t_i if t_i > 420 K. In these empirical equations, t_i is the thermodynamic temperature at which thermal decomposition of the adduct begins, as determined by thermogravimetry, and g is a constant factor that depends on the metal halide considered and on the number of ligands, but not on the ligand itself. For half of the tested adducts the difference between experimental and calculated values was less than 5%; for about 80% of the tested adducts, the difference between the experimental (calorimetric) and the calculated (using the proposed equations) values was less than 15%.
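A minimal sketch of the piecewise estimate quoted above, assuming t_i is given in kelvin and g is supplied by the caller (the g values themselves are tabulated per metal halide and ligand count in the original work and are not reproduced here):

```python
def bond_dissociation_enthalpy(t_i: float, g: float) -> float:
    """Estimate the mean metal-ligand bond dissociation enthalpy <D>(M-O)
    from the onset temperature t_i (K) of thermal decomposition and the
    empirical factor g, following the piecewise equations quoted above."""
    if t_i < 420.0:
        return t_i / g
    return t_i / g - 7.75e-2 * t_i

# Example with a hypothetical adduct: t_i = 450 K, g = 2.0
print(bond_dissociation_enthalpy(450.0, 2.0))  # 225.0 - 34.875 = 190.125
```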

Relevance:

30.00%

Publisher:

Abstract:

Nowadays robotics is one of the most important pillars of industry, and a piece of good news for engineers concerns robot sales: in 2013 about 179,000 industrial robots were sold worldwide, again an all-time high and 12% more than in 2012, according to data from the IFR (International Federation of Robotics). Alongside this, collaborative robotics comes into play when robots and human beings must share the workplace without humans being displaced by the machines; the aim is therefore for robots to improve the quality of work by taking over dangerous, tedious and dirty jobs that are not feasible or safe for human beings. Another very important and directly related concept, very much in vogue and heard of only relatively recently, is the "Factory of the Future", which seeks to bring operators and robots into harmony in the work environment so that robots are regarded as collaborative rather than substitutive machinery; it is considered one of the major production niches in full expansion. Leaving aside these technical concepts, which should never be forgotten if one's professional career is oriented towards this industrial field, the central topic of this project is, naturally, robotics, which, combined with machine vision, has resulted in a robotic manipulator endowed with a certain degree of "intelligence". A simple but feasible production process has been proposed in which pieces of different shapes and colours are stored autonomously, guided solely by the image captured with a webcam integrated into the equipment. The system consists of a support structure delimiting a working area onto which purpose-designed pieces are placed; these pieces must be stored in their corresponding place by the robotic manipulator. This parallel-kinematics manipulator is based on cable technology and is driven by four motors that give it three degrees of freedom (±X, ±Y, ±Z); the end effector is suspended over the working area and moves so that it can identify the position, colour and shape of the pieces in order to store them in an orderly fashion according to the initial premises.
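A simplified sketch (not the project's actual code) of how a webcam frame can be segmented by colour and classified by shape before commanding the manipulator. It assumes OpenCV 4.x and NumPy; the HSV ranges, area threshold and shape heuristic are illustrative placeholders.

```python
import cv2
import numpy as np

def find_pieces(frame_bgr, hsv_low, hsv_high):
    """Return (centroid, corner_count) for each piece matching a colour range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    # OpenCV 4.x API: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    pieces = []
    for c in contours:
        if cv2.contourArea(c) < 500:          # ignore small noise blobs
            continue
        approx = cv2.approxPolyDP(c, 0.04 * cv2.arcLength(c, True), True)
        m = cv2.moments(c)
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        pieces.append(((cx, cy), len(approx)))  # 3 ~ triangle, 4 ~ square, many ~ circle
    return pieces
```

The centroid gives the target position in image coordinates; a calibration step (not shown) would map it to the ±X, ±Y workspace of the cable-driven manipulator.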

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this thesis is to develop an environment or network that enables effective collaborative product structure management among stakeholders in each unit, throughout the entire product lifecycle and product data management. The thesis uses framework models as an approach to the problem. Three framework models are proposed to support collaborative product structure management: an organization model, a process model and a product model. In the organization model, the formation of the key-user network for the product data management (PDM) system (eDSTAT) is specified. In the process model, development is based on the case company's product development matrix. In the product model framework, product model management, product knowledge management and design knowledge management are defined as development tools, and collaboration is based on web-based product structure management. Collaborative management is executed using all of these approaches. A case study from an actual project at the case company is presented as an implementation in order to verify the models' applicability. A computer-aided design tool and the web-based product structure manager have been used as collaboration tools with the support of the key user. The current PDM system, eDSTAT, is used as a piloting case for the key-user role. As a result of this development, the role of the key user as a collaboration channel is defined and established: the key user is able to provide one-on-one support for elevator projects. Management activities are also improved through the application of a process workflow with criteria for each project milestone. The development demonstrates the effectiveness of product structure management across the product lifecycle and an improved production process, achieved by eliminating barriers (e.g. by improving two-way communication) during the design and production phases. The key-user role is applicable on a global scale in the company.
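A product structure of the kind managed here is essentially a hierarchical bill of materials. The following is a purely hypothetical sketch of such a structure; the item names and fields are illustrative and do not reflect eDSTAT's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Item:
    """One node of a product structure (bill of materials)."""
    number: str
    revision: str
    quantity: int = 1
    children: List["Item"] = field(default_factory=list)

    def flatten(self, level: int = 0):
        """Yield (indent level, item) pairs for a structure report."""
        yield level, self
        for child in self.children:
            yield from child.flatten(level + 1)

# Hypothetical example assembly
landing_door = Item("A-100", "B", children=[Item("A-110", "A", 4), Item("A-120", "C")])
for lvl, it in landing_door.flatten():
    print("  " * lvl + f"{it.number} rev {it.revision} x{it.quantity}")
```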

Relevance:

30.00%

Publisher:

Abstract:

Virtual screening is a central technique in drug discovery today. Millions of molecules can be tested in silico with the aim of selecting only the most promising ones for experimental testing. The topic of this thesis is ligand-based virtual screening tools, which take existing active molecules as the starting point for finding new drug candidates. One goal of this thesis was to build a model that gives the probability that two molecules are biologically similar as a function of one or more chemical similarity scores. Another important goal was to evaluate how well different ligand-based virtual screening tools are able to distinguish active molecules from inactive ones. A further criterion set for the virtual screening tools was their applicability to scaffold hopping, i.e. finding new active chemotypes. In the first part of the work, a link was defined between the abstract chemical similarity score given by a screening tool and the probability that the two molecules are biologically similar. These results help to decide objectively which virtual screening hits to test experimentally. The work also resulted in a new type of data fusion method for use with two or more tools. In the second part, five ligand-based virtual screening tools were evaluated, and their performance was found to be generally poor. Three reasons for this were proposed: false negatives in the benchmark sets, active molecules that do not share the binding mode, and activity cliffs. In the third part of the study, a novel visualization and quantification method is presented for evaluating the scaffold-hopping ability of virtual screening tools.
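A hedged sketch of the two ideas described above: mapping a raw similarity score to a probability of biological similarity with a logistic curve, and fusing the outputs of several tools by keeping, for a candidate molecule, the highest per-tool probability. The parameters and the fusion rule are illustrative; they are not the thesis's fitted model.

```python
import math

def prob_biologically_similar(score: float, a: float = 8.0, b: float = 0.6) -> float:
    """Logistic mapping from a chemical similarity score (e.g. Tanimoto in [0, 1])
    to an estimated probability that two molecules share biological activity.
    a and b are placeholder parameters; in practice they are fitted to data."""
    return 1.0 / (1.0 + math.exp(-a * (score - b)))

def fuse(per_tool_scores: dict) -> float:
    """Simple data fusion over several screening tools:
    keep the highest per-tool probability for the candidate molecule."""
    return max(prob_biologically_similar(s) for s in per_tool_scores.values())

print(fuse({"shape": 0.72, "fingerprint": 0.55, "pharmacophore": 0.81}))
```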

Relevance:

30.00%

Publisher:

Abstract:

Today's software industry faces increasingly complicated challenges in a world where software is almost ubiquitous in our daily lives. Consumers want products that are reliable, innovative and rich in functionality, but at the same time affordable. The challenge for those of us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one of the reasons why process improvement as a research area has not diminished in importance. IT professionals ask themselves: "How do we keep our promises to our customers while minimizing our risk and increasing our quality and productivity?" Within the field of process improvement there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight approaches, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with a specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three different angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level metrics. Third, knowledge sharing within large companies is improved through methods that put collaboration at the centre. The agile movement grew up during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that most of the daily work consists of creating something new that did not exist before. Every software developer must be an expert in her field and spends a large part of the working day creating solutions to problems she has never solved before. Although this has been a well-known fact for many decades, many software projects are still managed as if they were production lines in factories. One of the goals of the agile movement is to highlight precisely this discrepancy between the innermost nature of software development and the way software projects are managed. Agile methods have proven to work well in the contexts they were created for, i.e. small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interplay with the context in question can be gathered incrementally, which makes it easier to develop and adapt the methods to the specific requirements of that context. Second, resistance to change is easier to overcome when cultural changes are introduced gently and the target group gets direct first-hand contact with the new methods. Relevant product metrics can help software development teams improve their working methods. For teams working with agile and Lean methods, a good set of metrics can be decisive when prioritizing the list of tasks to be done. Our focus has been on supporting agile and Lean teams with internal product metrics for decision support concerning refactoring, i.e. continuous quality improvement of the program's code and design. The decision to refactor can be hard to make, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way to measure the design quality of systems developed with the so-called model-driven paradigm, and we also construct a way to integrate this metric into agile and Lean working methods. An important part of any process improvement initiative is spreading knowledge about the new software process, regardless of the kind of process being introduced, whether plan-driven or agile. We propose that collaborative approaches to creating and further developing the process are a good way to support knowledge sharing, and with that proposal in mind we give an overview of the process authoring tools available on the market.
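A purely illustrative sketch of the kind of internal product metric discussed above: scoring model elements by size and coupling and sorting them into a refactoring backlog. The weights and fields are placeholders and not the design-quality measure proposed in the thesis.

```python
from dataclasses import dataclass

@dataclass
class ModelElement:
    name: str
    operations: int      # size of the element
    dependencies: int    # outgoing couplings to other elements

def design_debt_score(e: ModelElement, w_size: float = 1.0, w_coupling: float = 2.0) -> float:
    """Higher score = stronger refactoring candidate (illustrative weights)."""
    return w_size * e.operations + w_coupling * e.dependencies

elements = [ModelElement("Dispatcher", 42, 11), ModelElement("Logger", 7, 2)]
backlog = sorted(elements, key=design_debt_score, reverse=True)
print([e.name for e in backlog])   # refactoring priority order
```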

Relevance:

30.00%

Publisher:

Abstract:

Leadership is essential for the effectiveness of teams and the organizations they are part of. The challenges facing organizations today require a thorough review of the strategic role of leadership. In this context, it is necessary to explore new types of leadership capable of providing an effective response to new needs. Present-day situations, characterized by complexity and ambiguity, make it difficult for an external leader to perform all leadership functions successfully. Likewise, knowledge-based work requires giving professional groups sufficient autonomy to perform leadership functions. This study focuses on shared leadership in the team context. Shared leadership is seen as an emergent team property resulting from the distribution of leadership influence across multiple team members. It entails sharing power and influence broadly among the team members rather than centralizing them in the hands of a single individual who acts in the clear role of a leader. By identifying the team itself as a key source of influence, this study points to the relational nature of leadership as a social construct, where leadership is seen as a social process of relating that is co-constructed by several team members. Based on recent theoretical developments concerned with relational, practice-based and constructionist approaches to the study of leadership processes, this thesis proposes studying leadership interactions, working processes and practices in order to focus on the construction of direction, alignment and commitment. During the research process, critical events, activities, working processes and practices of a case team were examined and analyzed with the grounded theory approach in terms of shared leadership. There are a variety of components to this complex process and a multitude of factors that may influence the development of shared leadership. The study suggests that the development of shared leadership is a common sense-making process and consists of four overlapping dimensions (individual, social, structural, and developmental) to work with as a team. For shared leadership to emerge, the members of the team must offer leadership services, and the team as a whole must be willing to rely on leadership by multiple team members. For these individual and collective behaviors to occur, the team members must believe that offering influence to and accepting it from fellow team members are welcome and constructive actions. Leadership emerges when people with differing world views use dialogue and collaborative learning to create spaces where a shared common purpose can be achieved while a diversity of perspectives is preserved and valued. The study also suggests that this process can be supported by different kinds of meaning-making and process tools. Leadership, then, does not reside in a person or in a role, but in the social system. The framework built here integrates the different dimensions of shared leadership and describes their relationships. In this way, the findings of this study contribute to the understanding of what constitutes essential aspects of shared leadership in the team context, which can be of theoretical value in advancing the adoption and development of shared leadership. In the real world, teams and organizations can create conditions to foster and facilitate the process. We should encourage leaders and team members to approach leadership as a collective effort that the team can be prepared for, so that the response is rapid and efficient.

Relevance:

30.00%

Publisher:

Abstract:

The focus of the present work was on 10- to 12-year-old elementary school students' conceptual learning outcomes in science in two specific inquiry-learning environments, laboratory and simulation. The main aim was to examine whether it would be more beneficial to combine than to contrast simulation and laboratory activities in science teaching. It was argued that the status quo, in which laboratories and simulations are seen as alternative or competing methods in science teaching, is hardly an optimal solution for promoting students' learning and understanding in various science domains. It was hypothesized that it would make more sense and be more productive to combine laboratories and simulations. Several explanations and examples were provided to back up the hypothesis. In order to test whether learning with a combination of laboratory and simulation activities can result in better conceptual understanding in science than learning with laboratory or simulation activities alone, two experiments were conducted in the domain of electricity. In these experiments students constructed and studied electrical circuits in three different learning environments: laboratory (real circuits), simulation (virtual circuits), and simulation-laboratory combination (real and virtual circuits used simultaneously). In order to measure and compare how these environments affected students' conceptual understanding of circuits, a subject knowledge assessment questionnaire was administered before and after the experimentation. The results of the experiments were presented in four empirical studies. Three of the studies focused on learning outcomes between the conditions and one on learning processes. Study I analyzed the learning outcomes from Experiment I. The aim of the study was to investigate whether it would be more beneficial to combine simulation and laboratory activities than to use them separately in teaching the concepts of simple electricity. Matched trios were created based on the pre-test results of 66 elementary school students and divided randomly into laboratory (real circuits), simulation (virtual circuits) and simulation-laboratory combination (real and virtual circuits simultaneously) conditions. In each condition students had 90 minutes to construct and study various circuits. The results showed that studying electrical circuits in the simulation-laboratory combination environment improved students' conceptual understanding more than studying circuits in the simulation or laboratory environment alone. Although there were no statistical differences between the simulation and laboratory environments, the learning effect was more pronounced in the simulation condition, where the students made clear progress during the intervention, whereas in the laboratory condition students' conceptual understanding remained at an elementary level after the intervention. Study II analyzed the learning outcomes from Experiment II. The aim of the study was to investigate if and how learning outcomes in the simulation and simulation-laboratory combination environments are mediated by implicit (only procedural guidance) and explicit (more structure and guidance for the discovery process) instruction in the context of simple DC circuits. Matched quartets were created based on the pre-test results of 50 elementary school students and divided randomly into simulation implicit (SI), simulation explicit (SE), combination implicit (CI) and combination explicit (CE) conditions.
The results showed that when the students were working with the simulation alone, they were able to gain a significantly greater amount of subject knowledge when they received metacognitive support (explicit instruction; SE) for the discovery process than when they received only procedural guidance (implicit instruction; SI). However, this additional scaffolding was not enough to reach the level of the students in the combination environment (CI and CE). A surprising finding in Study II was that instructional support had a different effect in the combination environment than in the simulation environment. In the combination environment explicit instruction (CE) did not seem to elicit much additional gain for students' understanding of electric circuits compared to implicit instruction (CI). Instead, explicit instruction slowed down the inquiry process substantially in the combination environment. Study III analyzed, based on video data, the learning processes of the 50 students who participated in Experiment II (cf. Study II above). The focus was on three specific learning processes: cognitive conflicts, self-explanations, and analogical encodings. The aim of the study was to find possible explanations for the success of the combination condition in Experiments I and II. The video data provided clear evidence of the benefits of studying with the real and virtual circuits simultaneously (the combination conditions). Mostly the representations complemented each other, that is, one representation helped students to interpret and understand the outcomes they received from the other representation. However, there were also instances in which analogical encoding took place, that is, situations in which the slightly discrepant results between the representations 'forced' students to focus on those features that could be generalised across the two representations. No statistical differences were found in the amount of experienced cognitive conflicts and self-explanations between the simulation and combination conditions, though in self-explanations there was a nascent trend in favour of the combination. There was also a clear tendency suggesting that explicit guidance increased the amount of self-explanations. Overall, the amount of cognitive conflicts and self-explanations was very low. The aim of Study IV was twofold: the main aim was to provide an aggregated overview of the learning outcomes of Experiments I and II; the secondary aim was to explore the relationship between the learning environments and students' prior domain knowledge (low and high) in the experiments. Aggregated results of Experiments I and II showed that, on average, 91% of the students in the combination environment scored above the average of the laboratory environment, and 76% of them also scored above the average of the simulation environment. Seventy percent of the students in the simulation environment scored above the average of the laboratory environment. The results further showed that overall, students seemed to benefit from combining simulations and laboratories regardless of their level of prior knowledge; that is, students with either low or high prior knowledge who studied circuits in the combination environment outperformed their counterparts who studied in the laboratory or simulation environment alone. The effect seemed to be slightly bigger among the students with low prior knowledge.
However, a more detailed inspection of the results showed that there were considerable differences between the experiments regarding how students with low and high prior knowledge benefitted from the combination: in Experiment I, especially students with low prior knowledge benefitted from the combination compared to the students who used only the simulation, whereas in Experiment II, only students with high prior knowledge seemed to benefit from the combination relative to the simulation group. Regarding the differences between the simulation and laboratory groups, the benefits of using a simulation seemed to be slightly higher among students with high prior knowledge. The results of the four empirical studies support the hypothesis concerning the benefits of using simulations along with laboratory activities to promote students' conceptual understanding of electricity. It can be concluded that when teaching students about electricity, they can gain a better understanding when they have an opportunity to use the simulation and the real circuits in parallel than if they have only the real circuits or only a computer simulation available, even when the use of the simulation is supported with explicit instruction. The outcomes of the empirical studies can be considered the first unambiguous evidence of the (additional) benefits of combining laboratory and simulation activities in science education, as compared to learning with laboratories or simulations alone.
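For readers who want to reproduce this kind of pre-/post-test comparison, a small sketch of one conventional measure of learning gain, the normalized gain (post - pre) / (max - pre); this is a standard measure used here purely for illustration and is not necessarily the exact statistic applied in the studies, and the numbers below are invented.

```python
def normalized_gain(pre: float, post: float, max_score: float) -> float:
    """Hake-style normalized gain: fraction of the possible improvement achieved."""
    return (post - pre) / (max_score - pre)

conditions = {
    # illustrative numbers only, not data from the experiments
    "laboratory":  (10.2, 12.1),
    "simulation":  (10.4, 15.0),
    "combination": (10.3, 18.2),
}
for name, (pre, post) in conditions.items():
    print(f"{name:12s} gain = {normalized_gain(pre, post, max_score=30):.2f}")
```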

Relevance:

30.00%

Publisher:

Abstract:

The aim of this licentiate thesis is to examine researchers' information practices in research groups. The researchers studied communication- and media-related issues within faculties of Social Sciences and Humanities. The theoretical framework of the study comprises the new holistic models of information seeking (for example, Meho and Tibbo, 2003; Seldén, 1999) and the collective aspects of information behaviour (Prekop, 2002; Talja, 2002; Talja and Hansen, 2006). The research questions are: 1. How do scholars seek information in research groups? 2. What kind of collaborative information behaviour occurs in research groups? The research data was gathered by interviews and observations. Three meetings of a research group at the University of Tampere were observed during the autumn of 2004, and the members and leader of that group were interviewed in the spring of 2005. The members and leader of a research group at the University of Jyväskylä were interviewed in the autumn of 2005. Altogether, two research group leaders and eight researchers were interviewed. The significance of the research group for information seeking is greater in close-knit research groups than in rather loose ones, and it can be at least threefold. First, research group members can inform the group about relevant information resources and potential library or other information services. Second, the research group can to some extent compensate for the information seeking systems of libraries by distributing material and information resources. Third, information seeking can be carried out collaboratively within research groups. The significance of the research group was found to be most important in informing members about new information services and in marketing library systems. Recommendations from colleagues were often needed to mobilize researchers into using new library services. The significance of colleagues in informing about library services is in line with earlier studies. The present study showed that information from colleagues was sometimes regarded as more important than information distributed directly by the local library. A culture of information sharing, including mutual trust, was mainly reflected in collaboration and collaborative information seeking in the research groups studied. The timing of the onset of individual research seemed to be related to the information sharing culture and social networks in the research groups: the simultaneous onset of the group members' research work seemed to promote the growth of unbiased collaboration, also in information seeking.

Relevance:

30.00%

Publisher:

Abstract:

The results of a numerical study of the ignition of premixed hydrogen-air flows by an oblique shock wave (OSW) stabilized by a wedge are presented, for situations in which the initial and boundary conditions are such that a transition between the initial OSW and an oblique detonation wave (ODW) is observed. More precisely, the objectives of the paper are: (i) to identify the different possible structures of the transition region that exists between the initial OSW and the resulting ODW, and (ii) to evidence the effect on the ODW of an abrupt decrease of the wedge angle, such that the final part of the wedge surface becomes parallel to the initial flow. For such a geometrical configuration and for the initial and boundary conditions considered, the overdriven detonation supported by the initial wedge angle is found to relax towards a Chapman-Jouguet detonation in the region where the wedge surface is parallel to the initial flow. Computations are performed using an adaptive, unstructured-grid, finite volume computer code previously developed for the computation of high-speed, compressible flows of reactive gas mixtures. Physico-chemical properties are functions of the local mixture composition, temperature and pressure, and they are computed using the CHEMKIN-II subroutines.
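For orientation, a small sketch of the classical oblique-shock deflection relation (theta-beta-Mach) that governs the wedge-stabilized OSW in the non-reactive, perfect-gas limit; this is textbook gas dynamics, not the reactive finite volume solver used in the paper.

```python
import math

def deflection_angle(mach: float, beta: float, gamma: float = 1.4) -> float:
    """Flow deflection angle theta (rad) produced by an oblique shock of
    wave angle beta (rad) at upstream Mach number `mach`, perfect gas:
    tan(theta) = 2 cot(beta) (M^2 sin^2(beta) - 1) / (M^2 (gamma + cos 2beta) + 2)."""
    num = (2.0 / math.tan(beta)) * (mach**2 * math.sin(beta)**2 - 1.0)
    den = mach**2 * (gamma + math.cos(2.0 * beta)) + 2.0
    return math.atan(num / den)

# Example: M1 = 5, wave angle 30 degrees -> deflection of roughly 20 degrees
print(math.degrees(deflection_angle(5.0, math.radians(30.0))))
```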

Relevance:

30.00%

Publisher:

Abstract:

This work presents the implementation and comparison of three different techniques of three-dimensional computer vision:

• Stereo vision: correlation between two 2D images;
• Sensor fusion: use of different sensors, a 2D camera plus a 1D ultrasound sensor;
• Structured light.

The computer vision techniques presented here were evaluated taking into consideration the following characteristics:

• Computational effort (elapsed time to obtain the 3D information);
• Influence of environmental conditions (noise due to non-uniform lighting, over-lighting and shadows);
• The cost of the infrastructure for each technique;
• Analysis of uncertainties, precision and accuracy.

The Matlab software, version 5.1, was chosen for the algorithm implementation of the three techniques because of the simplicity of its commands, programming and debugging. Moreover, this software is well known and widely used by the academic community, allowing the results of this work to be obtained and verified. Examples of three-dimensional vision applied to robotic assembly ("pick-and-place") tasks are presented.
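As a minimal illustration of the first technique (stereo vision), depth recovery from disparity for a rectified camera pair, Z = f·B/d; this is a present-day Python sketch rather than the original Matlab 5.1 implementation.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z (m) of a point seen by a rectified stereo pair:
    Z = f * B / d, with focal length f in pixels, baseline B in metres
    and disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: f = 800 px, baseline = 0.12 m, disparity = 16 px  ->  Z = 6.0 m
print(depth_from_disparity(800.0, 0.12, 16.0))
```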

Relevance:

30.00%

Publisher:

Abstract:

This study concerns performance measurement and management in a collaborative network. Collaboration between companies has increased in recent years due to the turbulent operating environment. The literature shows that there is a need for more comprehensive research on performance measurement in networks and on the use of measurement information in their management. This study examines the development process and uses of a performance measurement system supporting performance management in a collaborative network. There are two main research questions: how to design a performance measurement system for a collaborative network, and how to manage performance in a collaborative network. The work can be characterised as a qualitative single-case study. The empirical data was collected in a Finnish collaborative network consisting of a leading company and a reseller network. The work is based on five research articles applying various research methods. The research questions are examined at the network level and at the level of a single network partner. The study contributes to the earlier literature by producing a new and deeper understanding of network-level performance measurement and management. A three-step process model is presented to support the design of the performance measurement system; the process model has been tested in another collaborative network. The study also examines the factors affecting the design process of the measurement system. The results show that a participatory development style, network culture, and outside facilitators have a positive effect on the design process. The study increases understanding of how to manage performance in a collaborative network and of what kinds of uses of performance information can be identified in such a network. The results show that a performance measurement system is an applicable tool for managing the performance of a network. They also reveal that trust and openness increased during the utilisation of the performance measurement system, and operations became more transparent. Finally, the study presents a management model that evaluates the maturity of performance management in a collaborative network. The model is a practical tool that helps to analyse the current stage of performance management in a collaborative network and to develop it further.
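A purely illustrative sketch of how network-level performance information of the kind discussed here might be aggregated across partners; the measures, weights and partner names are hypothetical and do not represent the case network's actual measurement system.

```python
# Hypothetical partner scorecard: each partner reports a few measures on a 0-100 scale.
partners = {
    "reseller_A": {"delivery_accuracy": 92, "customer_satisfaction": 81, "sales_growth": 64},
    "reseller_B": {"delivery_accuracy": 75, "customer_satisfaction": 88, "sales_growth": 71},
}
weights = {"delivery_accuracy": 0.4, "customer_satisfaction": 0.4, "sales_growth": 0.2}

def partner_score(measures: dict) -> float:
    """Weighted average of one partner's measures."""
    return sum(weights[k] * v for k, v in measures.items())

network_score = sum(partner_score(m) for m in partners.values()) / len(partners)
print({name: round(partner_score(m), 1) for name, m in partners.items()},
      "network:", round(network_score, 1))
```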

Relevance:

30.00%

Publisher:

Abstract:

The evolution of our society is impossible without constant progress in life-important areas such as chemical engineering and technology. Innovation, creativity and technology are the three main components driving the progress of chemistry towards a sustainable society. Biomass, an attractive renewable feedstock for the production of fine chemicals, energy-rich materials and even transportation fuels, is progressively capturing new positions in the area of chemical technology. Knowledge of heterogeneous catalysis and chemical technology applied to the transformation of biomass-derived substances will open doors to a sustainable economy and facilitate the discovery of novel environmentally benign processes that may replace existing technologies in the era of the biorefinery. Aqueous-phase reforming (APR) is regarded as a promising technology for the production of hydrogen and liquid fuels from biomass-derived substances such as C3-C6 polyols. In the present work, the aqueous-phase reforming of glycerol, xylitol and sorbitol was investigated in the presence of supported Pt catalysts. The catalysts were deposited on different support materials, including Al2O3, TiO2 and carbons. Catalytic measurements were performed in a laboratory-scale continuous fixed-bed reactor. An advanced analytical approach was developed in order to identify reaction products and intermediates in the APR of polyols. The influence of the substrate structure on product formation and selectivity in the APR reaction was also investigated, showing that the yields of the desired products varied with the substrate chain length. Additionally, the influence of a bioethanol additive in the APR of glycerol and sorbitol was studied. A reaction network was advanced to explain the formation of products and key intermediates. Structure sensitivity in the aqueous-phase reforming reaction was demonstrated using a series of platinum catalysts supported on carbon with different Pt cluster sizes in the continuous fixed-bed reactor. Furthermore, a correlation between the textural and physico-chemical properties of the catalysts and the catalytic data was established. The effect of adding a second metal (Re, Cu) to the Pt catalysts was investigated in the APR of xylitol, showing superior hydrocarbon formation on bimetallic PtRe catalysts compared to monometallic Pt. On the basis of the experimental data obtained, mathematical modeling of the reaction kinetics was performed; the developed model was shown to describe the experimental data on the APR of sorbitol with good accuracy.
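For reference, the idealized overall stoichiometry of aqueous-phase reforming of the three polyols studied (decomposition to CO and H2 followed by the water-gas shift), which sets the maximum attainable hydrogen yield; the actual selectivities reported in the work deviate from these limits.

```latex
\begin{align*}
\text{glycerol:} \quad & \mathrm{C_3H_8O_3 + 3\,H_2O \rightarrow 3\,CO_2 + 7\,H_2} \\
\text{xylitol:}  \quad & \mathrm{C_5H_{12}O_5 + 5\,H_2O \rightarrow 5\,CO_2 + 11\,H_2} \\
\text{sorbitol:} \quad & \mathrm{C_6H_{14}O_6 + 6\,H_2O \rightarrow 6\,CO_2 + 13\,H_2}
\end{align*}
```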

Relevance:

30.00%

Publisher:

Abstract:

As the world becomes more technologically advanced and economies become globalized, computer science is evolving faster than ever before. With this evolution and globalization comes the need for sustainable university curricula that adequately prepare graduates for life in industry. Additionally, behavioural or "soft" skills have become just as important as technical abilities and knowledge, the "hard" skills. The objective of this study was to investigate the current skill gap that exists between computer science university graduates and actual industry needs, as well as the sustainability of current computer science university curricula, by conducting a systematic literature review of existing publications on the subject and a survey of recently graduated computer science students and their work supervisors. A quantitative study was carried out with respondents from six countries, mainly Finland; 31 of the responses came from recently graduated computer science professionals and 18 from their employers. The observed trends suggest that a skill gap really does exist, particularly with "soft" skills, and that many companies are forced to provide additional training to newly graduated employees if these are to be successful at their jobs.

Relevance:

30.00%

Publisher:

Abstract:

Many-core systems provide great potential in application performance with their massively parallel structure. Such systems are currently being integrated into most parts of daily life, from high-end server farms to desktop systems, laptops and mobile devices. Yet these systems face increasing challenges, such as high temperatures causing physical damage, high electricity bills both for servers and for individual users, unpleasant noise levels due to active cooling, and unrealistic battery drainage in mobile devices; these factors are caused directly by poor energy efficiency. Power management has traditionally been an area of research providing hardware solutions or runtime power management in the operating system in the form of frequency governors. Energy awareness in application software is currently non-existent: applications are not involved in power management decisions, nor does any interface exist between the applications and the runtime system to provide such facilities. Power management in the operating system is therefore performed purely on the basis of indirect implications of software execution, usually referred to as the workload, which often results in over-allocation of resources and hence in wasted power. This thesis discusses power management strategies in many-core systems in the form of increased awareness of energy efficiency in application software. The presented approach allows meta-data descriptions in the applications and is manifested in two design recommendations, 1) energy-aware mapping and 2) energy-aware execution, which allow the applications to directly influence power management decisions. The recommendations eliminate over-allocation of resources and increase the energy efficiency of the computing system. Both recommendations are fully supported by a provided interface in combination with a novel power management runtime system called Bricktop. The work presented in this thesis allows both new and legacy software to execute with the most energy-efficient mapping on a many-core CPU and at the most energy-efficient performance level. A set of case study examples demonstrates real-world energy savings in a wide range of applications without performance degradation.
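A hypothetical sketch of the kind of decision an energy-aware mapping makes: among a set of core-count/frequency operating points, pick the one that minimizes energy (power times runtime) while still meeting the application's deadline. The operating points, numbers and data structures are illustrative and do not represent Bricktop's actual interface.

```python
from dataclasses import dataclass

@dataclass
class OperatingPoint:
    cores: int
    freq_ghz: float
    power_w: float        # measured or modelled package power at this point
    runtime_s: float      # predicted runtime of the workload at this point

def pick_energy_aware(points, deadline_s: float) -> OperatingPoint:
    """Among points that meet the deadline, choose the minimum-energy one."""
    feasible = [p for p in points if p.runtime_s <= deadline_s]
    if not feasible:
        raise ValueError("no operating point meets the deadline")
    return min(feasible, key=lambda p: p.power_w * p.runtime_s)

points = [OperatingPoint(4, 1.2, 9.0, 40.0),
          OperatingPoint(8, 1.2, 15.0, 22.0),
          OperatingPoint(8, 2.4, 28.0, 12.0)]
best = pick_energy_aware(points, deadline_s=30.0)
print(best)   # 8 cores @ 1.2 GHz: 330 J vs 336 J for the 2.4 GHz point
```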

Relevance:

30.00%

Publisher:

Abstract:

Today, user experience and usability are becoming major design issues in software applications, as many processes are being adapted to new technologies. The study of user experience and usability should therefore be included in every software development project, and both should be tested to obtain traceable results. Given the variety of testing methods available to evaluate these concepts, a non-expert on the topic may be unsure which option to choose and how to interpret the outcomes of the process. This work aims to create a process that eases the whole testing methodology, based on the process created by Seffah et al., together with a supporting software tool for following the procedure of these testing methods for user experience and usability.
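As one concrete example of the quantitative output such a testing process can produce, a sketch that computes the System Usability Scale (SUS) score from a ten-item questionnaire; SUS is used here only as an illustration and is not necessarily the instrument prescribed by the Seffah et al. process or the proposed tool.

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert answers.
    Odd-numbered items contribute (answer - 1), even-numbered items (5 - answer);
    the sum is scaled by 2.5 to give a 0-100 score."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten answers on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```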