931 results for Service Programming Environment
Abstract:
In this paper, we propose three relay selection schemes for full-duplex heterogeneous networks in the presence of multiple cognitive radio eavesdroppers. In this setup, the cognitive small-cell nodes (secondary network) can share the spectrum licensed to the macro-cell system (primary network) on the condition that the quality-of-service of the primary network is always satisfied, subject to its outage probability constraint. The messages are delivered from one small-cell base station to the destination with the help of full-duplex small-cell base stations, which act as relay nodes. Based on the availability of the network’s channel state information at the secondary information source, three different selection criteria for full-duplex relays are proposed, namely: 1) partial relay selection; 2) optimal relay selection; and 3) minimal self-interference relay selection. We derive exact closed-form and asymptotic expressions of the secrecy outage probability for the three criteria under attack by non-colluding/colluding eavesdroppers. We demonstrate that the optimal relay selection scheme outperforms the partial relay selection and minimal self-interference relay selection schemes at the expense of acquiring full channel state information. In addition, increasing the number of full-duplex small-cell base stations can improve the security performance. On the illegitimate side, deploying colluding eavesdroppers and increasing the number of eavesdroppers put the confidential information at greater risk. Besides, the transmit power and the desired outage probability of the primary network have a great influence on the secrecy outage probability of the secondary network.
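The difference between the three criteria can be illustrated with a small sketch. All channel gains, the secrecy-rate formula, and the field names below are illustrative assumptions for exposition, not the paper's notation or system model: partial selection uses only the source-to-relay links (partial CSI), optimal selection maximizes an end-to-end secrecy rate (full CSI), and minimal self-interference selection picks the relay with the weakest residual loop-interference channel.

```python
import math

# Hypothetical per-relay channel state (illustrative values only):
# g_sr: source->relay gain, g_rd: relay->destination gain,
# g_re: strongest relay->eavesdropper gain, g_si: residual self-interference.
relays = [
    {"id": 0, "g_sr": 1.8, "g_rd": 2.1, "g_re": 0.40, "g_si": 0.30},
    {"id": 1, "g_sr": 2.5, "g_rd": 1.2, "g_re": 0.90, "g_si": 0.10},
    {"id": 2, "g_sr": 1.5, "g_rd": 2.8, "g_re": 0.20, "g_si": 0.05},
]

def secrecy_rate(r, p=1.0):
    """Illustrative secrecy rate: legitimate rate minus eavesdropper rate,
    with self-interference degrading the relay's receive SNR."""
    snr_leg = min(p * r["g_sr"] / (1.0 + p * r["g_si"]), p * r["g_rd"])
    snr_eve = p * r["g_re"]
    return max(math.log2(1 + snr_leg) - math.log2(1 + snr_eve), 0.0)

# 1) Partial relay selection: best source->relay link only (partial CSI).
partial = max(relays, key=lambda r: r["g_sr"])
# 2) Optimal relay selection: best end-to-end secrecy rate (full CSI).
optimal = max(relays, key=secrecy_rate)
# 3) Minimal self-interference selection: weakest loop-interference channel.
min_si = min(relays, key=lambda r: r["g_si"])
```

With these example gains, the relay with the strongest source link is not the one with the best secrecy rate, which mirrors the trade-off the abstract describes between CSI requirements and secrecy performance.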
Abstract:
There are two types of work typically performed in services, which differ in the degree of control management has over when the work must be done. Serving customers, an activity that can occur only when customers are in the system, is by its nature uncontrollable work. In contrast, the execution of controllable work does not require the presence of customers, and is work over which management has some degree of temporal control. This paper presents two integer programming models for optimally scheduling controllable work simultaneously with shifts. One model explicitly defines variables for the times at which controllable work may be started, while the other uses implicit modeling to reduce the number of variables. In an initial experiment of 864 test problems, the latter model yielded optimal solutions in approximately 81 percent of the time required by the former model. To evaluate the impact on customer service of having front-line employees perform controllable work, a second experiment was conducted simulating 5,832 service delivery systems. The results show that controllable work offers a useful means of improving labor utilization. Perhaps more importantly, it was found that having front-line employees perform controllable work did not degrade the desired level of customer service.
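The explicit-variable model can be sketched in generic tour-scheduling form. The symbols below are illustrative stand-ins, not the paper's exact formulation: they show how controllable-work start-time variables interact with shift coverage.

```latex
% Illustrative sketch of an explicit-variable model:
% x_j = employees assigned to shift j (cost c_j, coverage a_{tj} in period t),
% y_k = employees starting controllable-work block k at its explicit start
%       time (consuming b_{tk} server-periods in period t, d_k work units),
% r_t = servers required for customer work in period t,
% D   = total controllable work to be completed.
\begin{align}
\min \quad & \sum_j c_j x_j \\
\text{s.t.} \quad
  & \sum_j a_{tj} x_j - \sum_k b_{tk} y_k \ge r_t && \forall t
    && \text{(coverage net of controllable work)} \\
  & \sum_k d_k y_k \ge D
    && && \text{(all controllable work scheduled)} \\
  & x_j,\, y_k \in \mathbb{Z}_{\ge 0}
\end{align}
```

The implicit model described in the abstract would replace the explicit start-time variables $y_k$ with aggregate quantities, which is what reduces the variable count.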
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Conventional teaching practices often have difficulty keeping students motivated and engaged. Video games, however, are very successful at sustaining high levels of motivation and engagement through a set of tasks for hours without apparent loss of focus. In addition, gamers solve complex problems within a gaming environment without feeling the fatigue or frustration they would typically experience with a comparable learning task. Based on this notion, the academic community is keen on exploring methods that can deliver deep learner engagement and has shown increased interest in adopting gamification – the integration of gaming elements, mechanics, and frameworks into non-game situations and scenarios – as a means to increase student engagement and improve information retention. Its effectiveness when applied to education has been debatable, though, as attempts have generally been restricted to one-dimensional approaches such as transposing a trivial reward system onto existing teaching materials and/or assessments. Nevertheless, a gamified, multi-dimensional, problem-based learning approach can yield improved results even when applied to a very complex and traditionally dry task like the teaching of computer programming, as shown in this paper. The presented quasi-experimental study used a combination of instructor feedback, a real-time sequence of scored quizzes, and live coding to deliver a fully interactive learning experience. More specifically, the “Kahoot!” Classroom Response System (CRS), the classroom version of the TV game show “Who Wants To Be A Millionaire?”, and Codecademy’s interactive platform formed the basis for a learning model which was applied to an entry-level Python programming course. Students were thus allowed to experience multiple interlocking methods similar to those commonly found in a top-quality game experience.
To assess gamification’s impact on learning, empirical data from the gamified group were compared to those from a control group that was taught through a traditional learning approach, similar to the one used with previous cohorts. Despite this being a relatively small-scale study, the results and findings for a number of key metrics, including attendance, downloading of course material, and final grades, were encouraging and indicated that the gamified approach was motivating and enriching for both students and instructors.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
As robot imitation learning is beginning to replace conventional hand-coded approaches in programming robot behaviors, much work is focusing on learning from the actions of demonstrators. We hypothesize that in many situations, procedural tasks can be learned more effectively by observing object behaviors while completely ignoring the demonstrator's motions. To support studying this hypothesis and robot imitation learning in general, we built a software system named SMILE, a simulated 3D environment. In this virtual environment, both a simulated robot and a user-controlled demonstrator can manipulate various objects on a tabletop. The demonstrator is not embodied in SMILE, and therefore a recorded demonstration appears as if the objects move on their own. In addition to recording demonstrations, SMILE also allows programming the simulated robot via Matlab scripts, as well as creating highly customizable objects for task scenarios via XML. This report describes the features and usage of SMILE.
Abstract:
Sustainability in buildings, while reducing environmental impact, contributes to promoting social welfare and to increasing the health and productivity of occupants. The search for a way of building that meets the aspirations and development of humanity without degrading the environment has become the great challenge of contemporary architecture. It is considered that incorporating principles that produce a sustainable building, through careful choices of design solutions, contributes to better economic and thermal performance of the building, as well as functional and psychological comfort for its users. Based on this general understanding, this paper presents an architecture project aimed at health care, whose adopted solutions carefully follow the relevant legislation and focus on the theme of sustainability. The methodology began with studies on the themes of death verification services, sustainability, and their application in construction, developed through research in academic studies and analysis of architectural projects, which were used as references for the solutions adopted. As part of the project analysis, a visit was made to the death verification service in the city of Palmas, Tocantins, yielding information that, together with the relevant legislation, led to the functional programming and preliminary dimensioning of the building to be designed. The result of this programming of environments was a set of individual records containing information on environmental restrictions, the space required for the development of activities, desirable flows, and sustainability strategies, which can be considered the first relevant product of the professional master's degree.
Finally, we outlined the basic architectural design of a Death Verification Service (SVO/RN, in Portuguese), whose design process defined four guiding points: the use of bioclimatic architecture as the main design feature, the use of resources that cause minimal harm to the environment, the use of modulation and structure in the building as a form of rationalization, and the search for solutions that ensure environmental and psychological comfort for users. It is important to highlight that, besides addressing a theme rarely covered in the literature on architectural projects, the whole project was drawn up on the basis of design criteria that contribute to environmental sustainability, with emphasis on thermal performance, energy efficiency, and the reuse of rainwater.
Abstract:
As a result of globalization, two thirds of the world’s business nowadays takes place in the service sector. In line with this, professional service firms are growing their share of global service production. However, saturation of the professional service sector has forced professional service firms to search for more heuristic ways to conduct business in international markets. By leveraging the firm’s professionals effectively, a professional service firm can lower its costs to clients and simultaneously generate additional value for the company, thus gaining competitive advantage. Even though the academic field has shown growing interest in services for decades, the fields of service productization and service internationalization remain heavily understudied even today. Hence, the objective of this study was to contribute to the research on professional service internationalization and productization. The study concentrated on examining the impact that productization has on knowledge sharing and leveraging in professional service firms operating internationally. The research question focused on what implications productization has for knowledge transfer and leveraging during professional service internationalization, drawing on the existing research and on an empirical study. The empirical research was conducted as a single case study within a professional service firm operating in debt-related administrative service business. The case company is one of the leading operators in its field of business and therefore offered a fruitful environment in which to observe and analyze the topics in question. Additionally, the case company has a strong international presence and a large scale of operations in the selected markets, Finland, Norway and Sweden.
Based on the previous literature and on the empirical research, this study found that, for professional service firms to efficiently utilize individual, tacit knowledge in their internationalization processes, that knowledge must be shared with the whole organization. By exploiting productization as a knowledge leveraging mechanism, a PSF can apply and transfer knowledge during its internationalization processes that would otherwise be difficult to tap into. Productization might not be sufficient alone, but by complementing it with a favorable organizational structure and culture, and by encouraging open communication, a PSF may take advantage of the full potential that productization has to offer.
Abstract:
Maternal obesity has been shown to increase the risk for adverse reproductive health outcomes such as gestational diabetes, hypertension, and preeclampsia. Moreover, several studies have indicated that overnutrition and maternal obesity adversely program the development of offspring by predisposing them to obesity and other chronic diseases later in life. The exact molecular mechanisms leading to developmental programming are not known, but it has recently been suggested that obesity-related low-grade inflammation, gut microbiota and epigenetic gene regulation (in particular DNA methylation) participate in the developmental programming phenomenon. The aim of this thesis was to evaluate the effect of diet, dietary counseling and probiotic intervention during pregnancy in endorsing favorable developmental programming. The study population consisted of 256 mother-child pairs participating in a prospective, double-blinded dietary counseling and probiotic intervention (Lactobacillus rhamnosus GG and Bifidobacterium lactis Bb12) NAMI (Nutrition, Allergy, Mucosal immunology and Intestinal microbiota) study. Further, overweight women were recruited from maternal welfare clinics in the area of Southwest Finland and from the prenatal outpatient clinic at Turku University Hospital. Dietary counseling was aimed at modifying women’s dietary intake to comply with the recommended intake for pregnant women. Specifically, counseling aimed to affect the type of fat consumed and to increase the amount of fiber in the women’s diets. Leptin concentration was used as a marker for obesity-related low-grade inflammation, antioxidant vitamin status as an efficiency marker for dietary counseling, and epigenetic DNA methylation of obesity-related genes as a marker for the influence of probiotics. Results revealed that dietary intake may modify obesity-associated low-grade inflammation as measured by serum leptin concentration.
Specifically, dietary fiber intake may lower leptin concentration in women, whereas the intakes of saturated fatty acids and sucrose have the opposite effect. Neither dietary counseling nor probiotic intervention modified leptin concentration in women, but probiotics tended to increase children’s leptin concentration. Dietary counseling was an efficient tool for improving antioxidant vitamin intake in women, which was reflected in the breast milk vitamin concentration. Probiotic intervention affected DNA methylation of dozens of obesity- and weight-gain-related genes both in women and their children. Altogether these results indicate that dietary components, dietary counseling and probiotic supplementation during pregnancy may modify the intrauterine environment towards favorable developmental programming.
Abstract:
Processors with large numbers of cores are becoming commonplace. In order to utilise the available resources in such systems, the programming paradigm has to move towards increased parallelism. However, increased parallelism does not necessarily lead to better performance. Parallel programming models have to provide not only flexible ways of defining parallel tasks, but also efficient methods to manage the created tasks. Moreover, in a general-purpose system, applications residing in the system compete for the shared resources. Thread and task scheduling in such a multiprogrammed multithreaded environment is a significant challenge. In this thesis, we introduce a new task-based parallel reduction model, called the Glasgow Parallel Reduction Machine (GPRM). Our main objective is to provide high performance while maintaining ease of programming. GPRM supports native parallelism; it provides a modular way of expressing parallel tasks and the communication patterns between them. Compiling a GPRM program results in an Intermediate Representation (IR) containing useful information about tasks and their dependencies, as well as the initial mapping information. This compile-time information helps reduce the overhead of runtime task scheduling and is key to high performance. Generally speaking, the granularity and the number of tasks are major factors in achieving high performance. These factors are even more important in the case of GPRM, as it is highly dependent on tasks rather than threads. We use three basic benchmarks to provide a detailed comparison of GPRM with Intel OpenMP, Cilk Plus, and Threading Building Blocks (TBB) on the Intel Xeon Phi, and with GNU OpenMP on the Tilera TILEPro64. GPRM shows superior performance in almost all cases, solely by controlling the number of tasks. GPRM also provides a low-overhead mechanism, called “Global Sharing”, which improves performance in multiprogramming situations.
We use OpenMP, the most popular model for shared-memory parallel programming, as the main competitor to GPRM for solving three well-known problems on both platforms: LU factorisation of Sparse Matrices, Image Convolution, and Linked List Processing. We focus on proposing solutions that best fit GPRM’s model of execution. GPRM outperforms OpenMP in all cases on the TILEPro64. On the Xeon Phi, our solution for the LU Factorisation results in notable performance improvement for sparse matrices with large numbers of small blocks. We investigate the overhead of GPRM’s task creation and distribution for very short computations using the Image Convolution benchmark. We show that this overhead can be mitigated by combining smaller tasks into larger ones. As a result, GPRM can outperform OpenMP for convolving large 2D matrices on the Xeon Phi. Finally, we demonstrate that our parallel worksharing construct provides an efficient solution for Linked List processing and performs better than OpenMP implementations on the Xeon Phi. The results are very promising, as they verify that our parallel programming framework for manycore processors is flexible and scalable, and can provide high performance without sacrificing productivity.
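The mitigation described for very short computations (combining smaller tasks into larger ones) can be illustrated with a generic cost model. The fixed per-task overhead and all numbers below are illustrative assumptions, not measured GPRM figures: they only show why a fixed creation/distribution cost makes fine-grained tasks expensive, while over-coarse chunking destroys load balance.

```python
import math

def total_time(work_items, per_item_us, per_task_overhead_us, chunk_size, workers):
    """Illustrative cost model: each task carries a fixed creation/distribution
    overhead, so fewer, larger tasks amortize it better -- until there are too
    few tasks to keep all workers busy."""
    n_tasks = math.ceil(work_items / chunk_size)
    per_task_us = per_task_overhead_us + chunk_size * per_item_us
    rounds = math.ceil(n_tasks / workers)  # tasks execute `workers` at a time
    return rounds * per_task_us

# 1,000,000 short work items at 0.1 us each, 50 us per-task overhead, 60 workers.
fine       = total_time(1_000_000, 0.1, 50, chunk_size=100, workers=60)
coarse     = total_time(1_000_000, 0.1, 50, chunk_size=10_000, workers=60)
too_coarse = total_time(1_000_000, 0.1, 50, chunk_size=1_000_000, workers=60)
```

Under these assumed numbers the moderately coarse chunking wins: the fine-grained version pays the per-task overhead ten thousand times, while a single giant task serializes all the work on one worker.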
Abstract:
In seeking to fulfil the ambition of the 2003 genetics white paper, Our Inheritance, Our Future, to ‘mainstream’ genetic knowledge and practices, the Department of Health provided start-up funding for pilot services in various clinical areas, including seven cancer genetics projects. To help to understand the challenges encountered by such an attempt at reconfiguring the organization and delivery of services in this field, a programme-level evaluation of the genetics projects was commissioned to consider the organizational issues faced. Using a qualitative approach, this research has involved comparative case-study work in 11 of the pilot sites, including four of the seven cancer genetics pilots. In this paper, the researchers present early findings from their work, focusing in particular on the cancer genetics pilots. They consider some of the factors that have influenced how the pilots have sought to address pre-existing sector, organizational and professional boundaries to these new ways of working. The article examines the relationship between these factors and the extent to which pilots have succeeded in setting up boundary-spanning services, dealing with human-resource issues and creating sustainable, ‘mainstreamed’ provision which attracts ongoing funding in a volatile NHS commissioning environment where funding priorities do not always favour preventive, risk-assessment services.
Abstract:
The SimProgramming teaching approach aims to help students overcome their learning difficulties in the transition from entry-level to advanced computer programming and to prepare them for real-world labour environments through the adoption of learning strategies. It immerses learners in a businesslike learning environment, where students develop a problem-based learning activity with a specific set of tasks, one of which is filling in weekly individual forms. We conducted a thematic analysis of 401 weekly forms to identify the students’ strategies for self-regulation of learning during the assignment. The students adopt different strategies in each phase of the approach. The early phases are devoted to organization and planning, while later phases focus on applying theoretical knowledge and hands-on programming. Based on the results, we recommend the development of educational practices that help students conduct self-reflection on their performance during tasks.
Abstract:
Work presented at PAEE/ALE’2016, 8th International Symposium on Project Approaches in Engineering Education (PAEE) and 14th Active Learning in Engineering Education Workshop (ALE)
Abstract:
Copernicus is a European system for monitoring the Earth. COPERNICUS-CMEMS products and services are meant to serve all marine applications: marine resources, maritime safety, coastal and marine environment, and seasonal forecasting and climate. The service is ambitious, as the ocean is complex and many processes are involved, from physical oceanography, biology and geology to ocean-atmosphere fluxes, solar radiation, moon-induced tides, and anthropogenic activity. A multi-platform approach is essential, taking into account sea-level stations, coastal buoys, HF radars, river flows, drifting buoys, sea mammals or fish fitted with sensors, vessels, gliders, and floats.