817 results for Design theory


Relevance: 30.00%

Abstract:

Multiple models, methods and frameworks have been proposed to guide the application of Design Science Research (DSR) to relevant classes of problems in the Information Systems (IS) discipline. While much of the ambiguity around the research paradigm has been removed, only the surface has been scratched on DSR efforts in which the researcher takes an active role in organizational and industrial engagement to solve a specific problem and generalize the solution to a class of problems. Such DSR projects can have a significant impact on practice, link theories to real contexts and extend the scope of DSR. Considering these multiform settings, neither the implications for theorizing nor the crucial role of the researcher in the interplay of DSR and IS projects has been properly addressed. The emergent nature of such projects needs to be further investigated to realize such contributions for both theory and practice. This paper raises multiple theoretical, organizational and managerial considerations for a meta-level monitoring model for emergent DSR projects.

Relevance: 30.00%

Abstract:

Contemporary IT standards are designed, not selected. Their design enacts a complex process that brings together a coalition of players. We examine the design of the SOAP standard to discover activity patterns in this design process. The paper reports these patterns as a precursor to developing a micro-level process theory for designing IT standards.

Relevance: 30.00%

Abstract:

In this work we introduce a new mathematical tool for the optimization of routes, topology design, and energy efficiency in wireless sensor networks. We introduce a vector field formulation that models communication in the network, and routing is performed in the direction of this vector field at every location of the network. The magnitude of the vector field at every location represents the density of the data being transported through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area. With the above formulation, we introduce mathematical machinery based on partial differential equations very similar to Maxwell's equations in electrostatics. We show that in order to minimize the cost, the routes should be found based on the solution of these partial differential equations. In our formulation, the sensors are sources of information, analogous to the positive charges in electrostatics; the destinations are sinks of information, analogous to negative charges; and the network is analogous to a non-homogeneous dielectric medium with a variable dielectric constant (or permittivity coefficient). As one application of our vector field model, we offer a scheme for energy-efficient routing. Our routing scheme raises the permittivity coefficient in the parts of the network where nodes have high residual energy, and lowers it in the parts where the nodes do not have much energy left. Our simulations show that our method gives a significant increase in network lifetime compared to the shortest path and weighted shortest path schemes. Our initial focus is on the case where there is only one destination in the network; later we extend our approach to the case of multiple destinations.
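
The electrostatics analogy can be made concrete with a small numerical sketch (my own illustration, not the thesis implementation): the source is held at a high potential like a positive charge, the destination at a low potential like a negative charge, a discrete Laplace equation is relaxed over a grid, and a route is traced by following the potential downhill.

```python
# Sketch of the electrostatics-inspired routing idea. The grid size, fixed
# potentials, and greedy descent are my own simplifications for illustration.
N = 15
SOURCE, SINK = (2, 2), (12, 12)  # "positive" and "negative" charge locations

def neighbors(i, j):
    return [(a, b) for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if 0 <= a < N and 0 <= b < N]

# Jacobi relaxation: away from the charges, each cell converges to the
# average of its neighbours, i.e. a discrete harmonic (Laplace) solution.
phi = [[0.0] * N for _ in range(N)]
fixed = {SOURCE: 1.0, SINK: -1.0}
for _ in range(1500):
    nxt = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for j in range(N):
            if (i, j) in fixed:
                nxt[i][j] = fixed[(i, j)]
            else:
                nb = neighbors(i, j)
                nxt[i][j] = sum(phi[a][b] for a, b in nb) / len(nb)
    phi = nxt

# Route greedily "downhill": a harmonic potential has no interior minima, so
# repeatedly stepping to the lowest-potential neighbour ends at the sink.
path, cur = [SOURCE], SOURCE
while cur != SINK and len(path) < N * N:
    cur = min(neighbors(*cur), key=lambda c: phi[c[0]][c[1]])
    path.append(cur)
```

In the full model the permittivity would additionally vary with residual node energy, steering routes away from depleted regions; here it is uniform for brevity.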
In the case of multiple destinations, we need to partition the network into several areas known as the regions of attraction of the destinations. Each destination is responsible for collecting all messages generated in its region of attraction. The difficulty of the optimization problem in this case lies in how to define the regions of attraction and how much communication load to assign to each destination so as to optimize the performance of the network. We use our vector field model to solve the optimization problem for this case. We define a vector field which is conservative, and hence can be written as the gradient of a scalar field (also known as a potential field). We then show that in the optimal assignment of the communication load of the network to the destinations, the value of that potential field should be equal at the locations of all the destinations. Another application of our vector field model is to find the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the locations of the destinations. Based on this fact, we suggest an algorithm to be applied during the design phase of a network to relocate the destinations so as to reduce the communication cost function. The performance of our proposed schemes is confirmed by several examples and simulation experiments. In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates and define it as the degree to which a TCP aggregate reduces its sending rate to the network in response to packet drops. We define metrics that describe the responsiveness of TCP aggregates, and suggest two methods for determining the values of these quantities.
The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. This kind of test is not robust to multiple simultaneous tests performed at different routers. We make the test robust to simultaneous tests by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce a test of responsiveness for aggregates, which we call the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control. A distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates. In the next step we modify CAPM to offer methods for estimating the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. Our methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it, and observing the response of the aggregate. We offer two methods for conformance testing. In the first method, we apply the perturbation tests to SYN packets sent at the start of the TCP three-way handshake, using the fact that the rate of ACK packets exchanged in the handshake should follow the rate of perturbations. In the second method, we apply the perturbation tests to TCP data packets, using the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods, we use signature-based perturbations, meaning that packet drops are performed at a rate given by a function of time. We exploit the analogy between our problem and multiple-access communication to construct these signatures. Specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods.
As a result of this orthogonality, performance does not degrade because of cross-interference among simultaneously testing routers. We demonstrate the efficacy of our methods through mathematical analysis and extensive simulation experiments.
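
The orthogonal-signature idea can be illustrated in a few lines (a toy model of my own, with invented gains, not the thesis implementation): two routers perturb traffic simultaneously with orthogonal ±1 drop signatures, and correlating the observed aggregate response with each signature separates the two tests without cross-interference.

```python
# Toy model of the CDMA-style separation behind CAPM. The gains g1, g2 and
# the linear response model are invented for illustration.
L = 8
sig1 = [+1, -1, +1, -1, +1, -1, +1, -1]  # two orthogonal Walsh-Hadamard rows
sig2 = [+1, +1, -1, -1, +1, +1, -1, -1]  # (the all-ones DC row is avoided)
assert sum(a * b for a, b in zip(sig1, sig2)) == 0  # orthogonality

# Hypothetical responsiveness of each router's aggregate: how strongly its
# sending rate tracks that router's drop-rate perturbation signature.
g1, g2 = 0.7, 0.3

# The observed rate deviation superposes both simultaneous tests.
observed = [g1 * a + g2 * b for a, b in zip(sig1, sig2)]

# Despreading: correlating with each signature recovers that router's gain
# with no interference from the other test.
est1 = sum(o * s for o, s in zip(observed, sig1)) / L
est2 = sum(o * s for o, s in zip(observed, sig2)) / L
print(round(est1, 3), round(est2, 3))  # 0.7 0.3
```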

Relevance: 30.00%

Abstract:

Scheduling a set of jobs over a collection of machines to optimize a certain quality-of-service measure is one of the most important research topics in both computer science theory and practice. In this thesis, we design algorithms that optimize the flow-time (or delay) of jobs for scheduling problems that arise in a wide range of applications. We consider the classical model of unrelated machine scheduling and resolve several long-standing open problems; we introduce new models that capture the novel algorithmic challenges in scheduling jobs in data centers or large clusters; we study the effect of selfish behavior in distributed and decentralized environments; and we design algorithms that strive to balance energy consumption and performance.
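
The objective can be illustrated with a small single-machine example (my own toy instance, not taken from the thesis): the flow-time of a job is its completion time minus its release time, and the classic SRPT rule, which is optimal for total flow-time on one machine, beats FCFS.

```python
# Toy single-machine illustration of the flow-time objective.
jobs = [(0, 5), (1, 1), (2, 2)]  # (release time, processing size)

def total_flow_time(jobs, srpt):
    """Unit-time simulation; flow-time of a job = completion - release."""
    remaining = {i: size for i, (_, size) in enumerate(jobs)}
    done, t, flow = {}, 0, 0
    while len(done) < len(jobs):
        ready = [i for i in range(len(jobs))
                 if i not in done and jobs[i][0] <= t]
        if not ready:
            t += 1
            continue
        # SRPT preempts for the shortest remaining job; FCFS keeps arrival order.
        i = min(ready, key=lambda k: remaining[k]) if srpt else ready[0]
        remaining[i] -= 1
        t += 1
        if remaining[i] == 0:
            done[i] = t
            flow += t - jobs[i][0]
    return flow

print(total_flow_time(jobs, srpt=False), total_flow_time(jobs, srpt=True))  # 16 11
```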

The technically interesting aspect of our work is the surprising connections we establish between approximation and online algorithms, economics, game theory, and queuing theory. It is the interplay of ideas from these different areas that lies at the heart of most of the algorithms presented in this thesis.

The main contributions of the thesis can be placed in one of the following categories.

1. Classical Unrelated Machine Scheduling: We give the first polylogarithmic approximation algorithms for minimizing the average flow-time and minimizing the maximum flow-time in the offline setting. In the online and non-clairvoyant setting, we design the first non-clairvoyant algorithm for minimizing the weighted flow-time in the resource augmentation model. Our work introduces the iterated rounding technique for offline flow-time optimization, and gives the first framework for analyzing non-clairvoyant algorithms for unrelated machines.
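
Non-clairvoyance can be illustrated with Round Robin, the textbook algorithm that never inspects job sizes (a toy single-machine sketch of my own, not the thesis algorithm): it only needs to know which jobs are alive, sharing the machine equally among them.

```python
# Round Robin as a non-clairvoyant scheduler: it never reads remaining sizes,
# only which jobs are alive. The instance is invented for illustration.
EPS = 0.001  # processor-sharing time slice
jobs = [(0.0, 2.0), (0.0, 1.0), (1.0, 1.0)]  # (release, size); sizes "hidden"
remaining = [size for _, size in jobs]
finish = [None] * len(jobs)
t = 0.0
while any(f is None for f in finish):
    alive = [i for i in range(len(jobs))
             if finish[i] is None and jobs[i][0] <= t]
    for i in alive:
        remaining[i] -= EPS / len(alive)  # equal share of the slice
        if remaining[i] <= 1e-9:
            finish[i] = t + EPS
    t += EPS

total_flow = sum(f - r for f, (r, _) in zip(finish, jobs))
print(round(total_flow, 2))  # ≈ 9.0 (flow-times 4.0, 2.5, 2.5)
```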

2. Polytope Scheduling Problem: To capture the multidimensional nature of the scheduling problems that arise in practice, we introduce the Polytope Scheduling Problem (PSP). The PSP generalizes almost all classical scheduling models, and also captures hitherto unstudied scheduling problems such as routing multi-commodity flows, routing multicast (video-on-demand) trees, and multidimensional resource allocation. We design several competitive algorithms for the PSP and its variants for the objectives of minimizing flow-time and completion time. Our work establishes many interesting connections between scheduling and market equilibrium concepts, between fairness and non-clairvoyant scheduling, and between the queuing-theoretic notion of stability and resource augmentation analysis.

3. Energy Efficient Scheduling: We give the first non-clairvoyant algorithm for minimizing the total flow-time + energy in the online and resource augmentation model for the most general setting of unrelated machines.

4. Selfish Scheduling: We study the effect of selfish behavior in scheduling and routing problems. We define a fairness index for scheduling policies called bounded stretch, and show that for the objective of minimizing the average (weighted) completion time, policies with small stretch lead to equilibrium outcomes with a small price of anarchy. Our work gives the first linear/convex programming duality based framework to bound the price of anarchy for general equilibrium concepts such as coarse correlated equilibrium.

Relevance: 30.00%

Abstract:

BACKGROUND/AIMS: The obesity epidemic has spread to young adults, and obesity is a significant risk factor for cardiovascular disease. The prominence and increasing functionality of mobile phones may provide an opportunity to deliver longitudinal and scalable weight management interventions to young adults. The aim of this article is to describe the design and development of the intervention tested in the Cell Phone Intervention for You study and to highlight the importance of the adaptive intervention design that made it possible. The Cell Phone Intervention for You study was a National Heart, Lung, and Blood Institute-sponsored, controlled, 24-month randomized clinical trial comparing two active interventions to a usual-care control group. Participants were 365 overweight or obese (body mass index ≥ 25 kg/m²) young adults. METHODS: Both active interventions were designed based on social cognitive theory and incorporated techniques for behavioral self-management and motivational enhancement. Initial intervention development occurred during a 1-year formative phase utilizing focus groups and iterative, participatory design. During the intervention testing, adaptive intervention design, in which an intervention is updated or extended throughout a trial while assuring the delivery of exactly the same intervention to each cohort, was employed. The adaptive intervention design strategy distributed technical work and allowed the introduction of novel components in phases, intended to help promote and sustain participant engagement. Adaptive intervention design was made possible by exploiting the mobile phone's remote data capabilities, so that adoption of particular application components could be continuously monitored and components subsequently added or updated remotely.
RESULTS: The cell phone intervention was delivered almost entirely via cell phone and was always-present, proactive, and interactive, providing passive and active reminders, frequent opportunities for knowledge dissemination, and multiple tools for self-tracking and receiving tailored feedback. The intervention changed over the 2 years to promote and sustain engagement. The personal coaching intervention, alternatively, was delivered primarily by trained coaches based on a proven intervention, enhanced with a mobile application, but with all interactions with the technology being participant-initiated. CONCLUSION: The complexity and length of the technology-based randomized clinical trial created challenges in engagement and technology adaptation, which were generally discovered using novel remote monitoring technology and addressed using adaptive intervention design. Investigators should plan to develop tools and procedures that explicitly support continuous remote monitoring of interventions in long-term, technology-based studies, in addition to developing the interventions themselves.

Relevance: 30.00%

Abstract:

The original concept was to create a 'simulation' which would provide trainee teachers specializing in Information and Communications Technology (ICT) with the opportunity to explore a primary school environment. Within the simulation, factors affecting the development and implementation of ICT would be modelled so that trainees could develop the skills, knowledge and understanding necessary to identify appropriate strategies to overcome these limitations. To this end, we have developed Allsorts Primary, the prototype of a simulated interactive environment representing a typical primary school.

Relevance: 30.00%

Abstract:

This paper presents the findings of an experiment which examined the effects of performing applied tasks (action learning) prior to the theoretical learning of these tasks (explanation-based learning), and vice versa. The applied tasks took the form of laboratories for the Object-Oriented Analysis and Design (OOAD) course, while theoretical learning was delivered via lectures.

Relevance: 30.00%

Abstract:

While e-learning technologies are continuously developing, there are a number of emerging issues and challenges that have a significant impact on e-learning research and design, spanning educational, technological, sociological, and psychological viewpoints. The extant literature points out that a large number of existing e-learning systems have problems offering reusable, personalized and learner-centric content. While developers place emphasis on the technology aspects of e-learning, critical conceptual and pedagogical issues are often ignored. This paper reports on our research in the design and development of personalised e-learning systems and some of the challenges and issues faced.

Relevance: 30.00%

Abstract:

A nomadic collaborative partnership model for a community of practice (CoP) in Design for Learning (D4L) can facilitate successful innovation and continuing appraisals of effective professional practice, stimulated by a 'critical friend' assigned to the project. This paper reports on e-learning case studies collected by the JISC-funded UK eLIDA CAMEL Design for Learning Project. The project implemented and evaluated learning design (LD) tools in higher and further education within the JISC Design for Learning pedagogic programme (2006-07). Project partners trialled professional user evaluations of innovative e-learning tools with learning design function, collecting D4L case studies and LD sequences in post-16/HE contexts using LAMS and Moodle. The project brought together learning activity sequences within a collaborative e-learning community of practice based on the CAMEL (Collaborative Approaches to the Management of e-Learning) model, contributing to international D4L developments. This paper provides an overview of project outputs in e-learning innovations, including evaluations from teachers and students. The paper explores intentionality in the development of a CoP in design for learning, reporting on trials of LD and social software that bridged tensions between formalised intra-institutional e-learning relationships and the dynamic D4L interactions of inter-institutional professional project teams. Following a brief report of D4L case studies and feedback, the catalytic role of the 'critical friend' is highlighted and recommended as a key ingredient in the successful development of a nomadic model of communities of practice for managing professional e-learning projects. eLIDA CAMEL partners included the Association for Learning Technology (ALT), JISC infoNet, three universities and five FE/Sixth Form Colleges. Results reported to JISC demonstrated D4L e-learning innovations by practitioners, illuminated by the role of the 'critical friend'.
The project also benefited from formal case study evaluations and the leading work of ALT and JISC infoNet in the development of the CAMEL model.

Relevance: 30.00%

Abstract:

The article presents cost modeling results from the application of the Genetic-Causal cost modeling principle. Industrial results from redesign are also presented to verify the opportunity for early concept cost optimization by using Genetic-Causal cost drivers to guide the conceptual design process for structural assemblies. The acquisition cost is considered through the modeling of the recurring unit cost and non-recurring design cost. The operational cost is modeled relative to acquisition cost and fuel burn for predominantly metal or composite designs. The main contribution of this study is the application of the Genetic-Causal principle to the modeling of cost, helping to understand how conceptual design parameters impact cost, and linking that to customer requirements and life cycle cost.

Relevance: 30.00%

Abstract:

Design-build experiences (DBEs) are an essential element of any programme based on the CDIO methodology. They enable students to develop practical hands-on skills, support the learning of theory by stealth, and provide a forum for developing professional skills such as team working and project management. The hands-on aspect of certain DBEs carries significant risk, which must be addressed through the formal evaluation of risks and the development of a methodology for controlling them. This paper considers the aspects of design-build experiences that may impact on student safety. In particular, it examines the risk associated with each of the four stages of CDIO and gives examples of risks which may commonly apply across engineering disciplines. A system for assessing and controlling the risks in any particular DBE is presented, and the paper concludes by discussing the significance of health and safety in the educational environment.