915 results for Two-level scheduling and optimization


Relevance: 100.00%

Abstract:

A growing body of research focuses on the expanding roles of NGOs in global and supranational governance. This research emphasizes the increasing number of ways NGOs participate in policymaking and cross-national cooperation, and it has produced important insights into their evolving political role and growing involvement in governance. The focus on activities at the transnational level has, however, led to the virtual exclusion of research on other levels of governance, so it has not been possible to tell whether the locus of NGOs' political activity is shifting from the national to the transnational environment or simply broadening. Missing from the literature is an examination of the variety of cooperative relationships, including those between NGOs, that shape policy involvement across different levels of governance. To bridge this gap, I address two key questions: 1) Is cooperation among NGOs a common feature of social movement activity across levels of governance, and if so, what does the structure of that cooperation look like? 2) What impact, if any, does cooperation have on the expanding political involvement of NGOs, both within and across levels of governance? Using data from an original survey of migrant and refugee organizations across much of Europe, I test several hypotheses that shed light on these issues. The findings broadly indicate that 1) cooperation is a widely used strategy across levels of governance, 2) cooperation with specific sets of actors increases the likelihood of NGO involvement at different levels of governance (in particular, cooperation with EU-level actors increases the likelihood of national-level involvement), and 3) NGOs are more likely to extend their involvement across a range of institutions if they cooperate with a broad range of actors.

Relevance: 100.00%

Abstract:

In this thesis, we consider four different scenarios of interest in modern satellite communications. For each scenario, we propose advanced solutions aimed at increasing the spectral efficiency of the communication links. First, we investigate the optimization of the current standard for digital video broadcasting: we increase the symbol rate of the signal, determine the optimal signal bandwidth, apply the time-packing technique, and propose a specifically designed constellation. We then compare several receiver architectures with different performance/complexity trade-offs. The second scenario still addresses broadcast transmissions, but in a network composed of two satellites. We compare three alternative transceiver strategies, namely, signals completely overlapped in frequency, frequency division multiplexing, and the Alamouti space-time block code, and, for each technique, we derive theoretical results on the achievable rates. We also evaluate the performance of these techniques in three different channel models. The third scenario deals with the application of multiuser detection in multibeam satellite systems. We analyze a case in which the users are near the edge of the coverage area and hence experience a high level of interference from adjacent cells. Also in this case, three different approaches are compared: a classical approach in which each beam carries information for a single user, a cooperative solution based on time division multiplexing, and the Alamouti scheme. The information-theoretic analysis is followed by the study of practical coded schemes, and we show that the theoretical bounds can be approached by a properly designed code or bit mapping. Finally, we consider an Earth observation scenario, in which data is generated on the satellite and then transmitted to the ground. We study two channel models, taking into account one or two transmit antennas, and apply techniques such as time and frequency packing, signal predistortion, multiuser detection, and the Alamouti scheme.
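As a rough illustration of the kind of achievable-rate comparison carried out for the two-satellite broadcast scenario, the following sketch (with invented link parameters and an idealized AWGN assumption rather than the thesis's channel models) contrasts frequency division multiplexing with complete spectral overlap under joint detection.

    # Illustrative rate comparison for two satellites broadcasting to one receiver.
    # Parameters and the AWGN/MAC model are assumptions, not results from the thesis.
    import math

    bandwidth_hz = 36e6          # hypothetical transponder bandwidth
    snr_per_satellite = 10.0     # hypothetical linear SNR contributed by each satellite

    # Frequency division multiplexing: each satellite gets half the band, no mutual interference.
    rate_fdm = 2 * (bandwidth_hz / 2) * math.log2(1 + snr_per_satellite)

    # Complete spectral overlap with joint (multiuser) detection: two-user MAC sum rate.
    rate_overlap = bandwidth_hz * math.log2(1 + 2 * snr_per_satellite)

    print(f"FDM sum rate:     {rate_fdm / 1e6:.1f} Mbit/s")
    print(f"Overlap sum rate: {rate_overlap / 1e6:.1f} Mbit/s")

Under this idealized model the overlapped strategy attains the larger sum rate because joint detection operates over the full bandwidth; the thesis quantifies how such comparisons change with realistic channels and with the Alamouti scheme.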

Relevance: 100.00%

Abstract:

This paper complements the preceding one by Clarke et al., which looked at the long-term impact of retail restructuring on consumer choice at the local level. Whereas the previous paper was based on quantitative evidence from survey research, this paper draws on the qualitative phases of the same three-year study, and in it we aim to understand how the changing forms of retail provision are experienced at the neighbourhood and household level. The empirical material is drawn from focus groups, accompanied shopping trips, diaries, interviews, and kitchen visits with eight households in two contrasting neighbourhoods in the Portsmouth area. The data demonstrate that consumer choice involves judgments of taste, quality, and value as well as more ‘objective’ questions of convenience, price, and accessibility. These judgments are related to households’ differential levels of cultural capital and involve ethical and moral considerations as well as more mundane considerations of practical utility. Our evidence suggests that many of the terms that are conventionally advanced as explanations of consumer choice (such as ‘convenience’, ‘value’, and ‘habit’) have very different meanings according to different household circumstances. To understand these meanings requires us to relate consumers’ at-store behaviour to the domestic context in which their consumption choices are embedded. Bringing theories of practice to bear on the nature of consumer choice, our research demonstrates that consumer choice between stores can be understood in terms of accessibility and convenience, whereas choice within stores involves notions of value, price, and quality. We also demonstrate that choice between and within stores is strongly mediated by consumers’ household contexts, reflecting the extent to which shopping practices are embedded within consumers’ domestic routines and complex everyday lives. The paper concludes with a summary of the overall findings of the project, and with a discussion of the practical and theoretical implications of the study.

Relevance: 100.00%

Abstract:

This paper focuses on minimizing printed circuit board (PCB) assembly time for a chip shooter machine, which has a movable feeder carrier holding components, a movable X–Y table carrying a PCB, and a rotary turret with multiple assembly heads. The assembly time of the machine depends on two inter-related optimization problems: the component sequencing problem and the feeder arrangement problem. Nevertheless, these have often been treated as two separate problems and solved individually. This paper proposes two complete mathematical models for the integrated problem of the machine. The models are verified using two commercial packages. Finally, a hybrid genetic algorithm previously developed by the authors is presented to solve the model. The algorithm not only generates optimal solutions quickly for small-sized problems, but also outperforms the genetic algorithms developed by other researchers in terms of total assembly time.
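A minimal sketch of how the two sub-problems can be encoded and searched jointly is shown below; the chromosome layout, the simultaneous-motion timing model, and all constants are simplifications invented for illustration, not the paper's mathematical models or its hybrid genetic algorithm.

    # Toy joint search over component sequencing and feeder arrangement for a
    # chip shooter. Movement model and constants are invented for illustration.
    import random

    COMPONENTS = [(10, 5, "R1"), (30, 20, "C3"), (15, 35, "R1"), (40, 10, "C3"), (25, 25, "U7")]
    TYPES = sorted({c[2] for c in COMPONENTS})
    TABLE_SPEED, FEEDER_SPEED, TURRET_TIME = 100.0, 20.0, 0.15   # assumed units

    def assembly_time(sequence, slot_of_type):
        """Each placement waits for the slowest of the X-Y table move,
        the feeder carrier move, and the turret indexing time."""
        total, prev_xy, prev_slot = 0.0, (0.0, 0.0), 0
        for idx in sequence:
            x, y, ctype = COMPONENTS[idx]
            slot = slot_of_type[ctype]
            table_t = max(abs(x - prev_xy[0]), abs(y - prev_xy[1])) / TABLE_SPEED
            feeder_t = abs(slot - prev_slot) / FEEDER_SPEED
            total += max(table_t, feeder_t, TURRET_TIME)
            prev_xy, prev_slot = (x, y), slot
        return total

    def random_individual():
        seq = random.sample(range(len(COMPONENTS)), len(COMPONENTS))
        slots = dict(zip(TYPES, random.sample(range(len(TYPES)), len(TYPES))))
        return seq, slots

    def mutate(individual):
        seq, slots = list(individual[0]), dict(individual[1])
        i, j = random.sample(range(len(seq)), 2)
        seq[i], seq[j] = seq[j], seq[i]              # reorder two placements
        a, b = random.sample(TYPES, 2)
        slots[a], slots[b] = slots[b], slots[a]      # swap two feeder slots
        return seq, slots

    population = [random_individual() for _ in range(30)]
    for _ in range(200):                             # plain evolutionary loop, no hybrid local search
        population += [mutate(random.choice(population)) for _ in range(30)]
        population.sort(key=lambda ind: assembly_time(*ind))
        population = population[:30]

    best_seq, best_slots = population[0]
    print(best_seq, best_slots, round(assembly_time(best_seq, best_slots), 3))

Encoding both decisions in one individual is what lets the search trade off placement order against feeder positions, which is the essence of treating the two problems as one.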

Relevance: 100.00%

Abstract:

From a manufacturing perspective, the efficiency of manufacturing operations (such as process planning and production scheduling) is a key element in enhancing manufacturing competence. Process planning and production scheduling have traditionally been treated as two separate activities, which has resulted in a range of inefficiencies, including infeasible process plans, unavailable or overloaded resources, high production costs, long production lead times, and so on. Above all, dynamic changes are unlikely to be dealt with efficiently. Although much research has been conducted on integrating process planning and production scheduling to generate optimised solutions and improve manufacturing efficiency, a gap remains between current capabilities and the competence required for today's globally competitive market. In this research, the concept of a multi-agent system (MAS) is adopted as a means to address this gap. A MAS consists of a collection of intelligent, autonomous agents able to solve complex problems; these agents possess individual objectives and interact with each other to fulfil a global goal. This paper describes a novel use of an autonomous agent system to facilitate the integration of process planning and production scheduling functions to cope with unpredictable demands, in terms of uncertainties in product mix and demand pattern. The novelty lies in the currency-based iterative agent bidding mechanism, which allows process planning and production scheduling options to be evaluated simultaneously so as to search for an optimised, cost-effective solution. This agent-based system aims to achieve manufacturing competence by enhancing the flexibility and agility of manufacturing enterprises.
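A minimal sketch of one possible currency-based bidding round between a job agent and machine agents is given below; the cost model, the budget-inflation rule, and the agent interfaces are invented for illustration and are much simpler than the iterative mechanism described in the paper.

    # Toy currency-based bidding between a job agent and machine agents.
    # Costs, premiums, and the budget-raising rule are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class MachineAgent:
        name: str
        rate_per_hour: float      # assumed operating cost
        queue_hours: float        # workload already committed

        def bid(self, task_hours, deadline_hours):
            if self.queue_hours + task_hours > deadline_hours:
                return None                       # cannot meet the deadline
            # price = processing cost plus a congestion premium on a busy machine
            return task_hours * self.rate_per_hour * (1 + 0.1 * self.queue_hours)

    def run_auction(machines, task_hours, deadline_hours, budget, max_rounds=5):
        """The job agent raises its currency offer until a bid fits the budget."""
        for _ in range(max_rounds):
            bids = [(m.bid(task_hours, deadline_hours), m) for m in machines]
            feasible = [(p, m) for p, m in bids if p is not None and p <= budget]
            if feasible:
                price, winner = min(feasible, key=lambda b: b[0])
                winner.queue_hours += task_hours  # commit the winning machine
                return winner.name, price
            budget *= 1.2                         # no award yet: inflate the offer
        return None, None

    machines = [MachineAgent("mill", 40.0, 3.0), MachineAgent("lathe", 30.0, 6.0)]
    print(run_auction(machines, task_hours=2.0, deadline_hours=8.0, budget=60.0))

In a full system such bids would reflect both process-planning alternatives (which machine or route to use) and scheduling consequences (queueing and due dates), so that evaluating them together steers the search toward cost-effective integrated solutions.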

Relevance: 100.00%

Abstract:

In this paper, we use plant-level data from two Indian industries, namely electrical machinery and textiles, to examine the empirical relationship between structural reforms (such as the abandonment of entry restrictions to the product market), competition, and firm-level productivity and efficiency. These industries have faced different sets of policies since Independence, but both were restricted in the adoption of technology and in the development of optimal scales of production. They also belonged to the first set of industries to benefit from the liberalization process started in the 1980s. Our results suggest that both industries had improved their efficiency and scale of operation by the turn of the century. However, the process of adjustment seems to have been worked out more fully for electrical machinery. We also find evidence of spatial fragmentation of the market as late as 2000–2001. Gains in labour productivity were much more evident in states that either have a strong history of industrial activity or have experienced significant improvements in the business environment since 1991.

Relevance: 100.00%

Abstract:

We set out to distinguish level 1 (VPT-1) and level 2 (VPT-2) perspective taking with respect to the embodied nature of the underlying processes, as well as to investigate their dependence on, or independence of, response modality (motor vs. verbal). While VPT-1 reflects understanding of what lies within someone else’s line of sight, VPT-2 involves mentally adopting someone else’s spatial point of view. Perspective taking is a high-level, conscious, and deliberate mental transformation that is crucially placed at the convergence of perception, mental imagery, communication, and, in the case of VPT-2, even theory of mind. The differences between VPT-1 and VPT-2 mark a qualitative boundary between humans and apes, with the latter being capable of VPT-1 but not of VPT-2. However, our recent data showed that VPT-2 is best conceptualized as the deliberate simulation or emulation of a movement, underpinning its embodied origins. In the work presented here we compared VPT-2 to VPT-1 and found that VPT-1 is either not embodied at all or embodied very differently. In a second experiment we replicated the qualitatively different patterns for VPT-1 and VPT-2 with verbal responses that employed spatial prepositions. We conclude that VPT-1 is the cognitive process that subserves verbal localizations using “in front” and “behind,” while VPT-2 subserves “left” and “right” from a perspective other than the egocentric one. We further conclude that both processes are grounded and situated, but only VPT-2 is embodied in the form of a deliberate movement simulation whose mental effort increases with distance and incongruent proprioception. The differences in cognitive effort predict differences in the use of the associated prepositions. Our findings therefore shed light on the situated, grounded, and embodied basis of spatial localizations and on the psychology of their use.

Relevance: 100.00%

Abstract:

The purpose of this study was to examine the relationship between the spiritual well-being of nurses and its influence on their attitudes toward providing spiritual care to patients. Two research instruments and a demographic data form were used for the survey. Using a descriptive design, the Spiritual Well-Being Scale, the Health Professional's Spiritual Role Scale, and the demographic data form were administered to 100 registered nurses from a large South Florida teaching hospital. The findings indicated a significantly positive correlation between the overall Spiritual Well-Being Scale and the Health Professional's Spiritual Role Scale (r = 0.52; p = .005). Significant associations were found between nurses' levels of spiritual well-being and all sociodemographic factors except the three age groups and religious affiliation. The findings have implications for how nurses should be trained to meet patients' total needs.

Relevance: 100.00%

Abstract:

This dissertation presents and evaluates a methodology for scheduling medical application workloads in virtualized computing environments. Such environments are being widely adopted by providers of "cloud computing" services. In the context of provisioning resources for medical applications, such environments allow users to deploy applications on distributed computing resources while keeping their data secure. Furthermore, higher-level services that further abstract the infrastructure-related issues can be built on top of such infrastructures. For example, a medical imaging service can allow medical professionals to process their data in the cloud, relieving them of the burden of having to deploy and manage these resources themselves. In this work, we focus on issues related to scheduling scientific workloads on virtualized environments. We build upon the knowledge base of traditional parallel job scheduling to address the specific case of medical applications while harnessing the benefits afforded by virtualization technology. To this end, we provide the following contributions: (1) An in-depth analysis of the execution characteristics of the target applications when run in virtualized environments. (2) A performance prediction methodology applicable to the target environment. (3) A scheduling algorithm that harnesses application knowledge and virtualization-related benefits to provide strong scheduling performance and quality of service guarantees. In the process of addressing these pertinent issues for our target user base (i.e., medical professionals and researchers), we provide insight that benefits a large community of scientific application users in industry and academia. Our execution time prediction and scheduling methodologies are implemented and evaluated on a real system running popular scientific applications. We find that we are able to predict the execution time of a number of these applications with an average error of 15%. Our scheduling methodology, which is tested with medical image processing workloads, is compared against two baseline scheduling solutions, and we find that it outperforms them in terms of both the number of jobs processed and resource utilization by 20–30%, without violating any deadlines. We conclude that our solution is a viable approach to supporting the computational needs of medical users, even if the cloud computing paradigm is not widely adopted in its current form.
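A minimal sketch of a prediction-driven, deadline-aware dispatcher in the spirit of this approach appears below; the 15% safety margin is motivated by the reported average prediction error, but the admission rule, the VM model, and the earliest-deadline ordering are illustrative assumptions rather than the evaluated algorithm.

    # Illustrative deadline-aware dispatcher driven by predicted execution times.
    # The scheduling rule and VM abstraction are assumptions for illustration.
    import heapq

    PREDICTION_MARGIN = 1.15   # pad predictions by 15% before the admission check

    class VirtualMachine:
        def __init__(self, name):
            self.name = name
            self.busy_until = 0.0   # time at which this VM becomes free

    def schedule(jobs, vms, now=0.0):
        """jobs: (job_id, predicted_runtime, deadline); handled earliest deadline first."""
        placements = []
        pool = [(vm.busy_until, i, vm) for i, vm in enumerate(vms)]
        heapq.heapify(pool)
        for job_id, predicted, deadline in sorted(jobs, key=lambda j: j[2]):
            free_at, i, vm = heapq.heappop(pool)
            start = max(now, free_at)
            finish = start + predicted * PREDICTION_MARGIN
            if finish <= deadline:
                vm.busy_until = finish
                placements.append((job_id, vm.name, start, finish))
            else:
                placements.append((job_id, None, None, None))  # would miss its deadline
            heapq.heappush(pool, (vm.busy_until, i, vm))
        return placements

    vms = [VirtualMachine("vm-a"), VirtualMachine("vm-b")]
    jobs = [("segmentation", 40.0, 120.0), ("registration", 90.0, 100.0), ("denoise", 30.0, 200.0)]
    for placement in schedule(jobs, vms):
        print(placement)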

Relevance: 100.00%

Abstract:

Acknowledgments: The authors acknowledge the support of the Engineering and Physical Sciences Research Council, grant number EP/M002322/1. The authors would also like to thank the Numerical Analysis Group at the Rutherford Appleton Laboratory for their Fortran HSL packages (HSL, a collection of Fortran codes for large-scale scientific computation; see http://www.hsl.rl.ac.uk/).

Relevance: 100.00%

Abstract:

Performing experiments on small-scale quantum computers is certainly a challenging endeavor. Many parameters need to be optimized to achieve high-fidelity operations. This can be done efficiently for operations acting on single qubits, as their errors can be fully characterized. For multiqubit operations, however, this is no longer true: in the most general case, analyzing the effect of the operation on the system requires full state tomography, for which the resources scale exponentially with the system size. Furthermore, in recent experiments, additional electronic levels beyond the two-level system encoding the qubit have been used to enhance the capabilities of quantum-information processors, which further increases the number of parameters that need to be controlled. To optimize the experimental system for a given task (e.g., a quantum algorithm), one has to find a satisfactory error model and efficient observables with which to estimate the parameters of the model. In this manuscript, we demonstrate a method to optimize the encoding procedure for a small quantum error correction code in the presence of unknown but constant phase shifts. The method, which we implement here on a small-scale linear ion-trap quantum computer, is readily applicable to other AMO platforms for quantum-information processing.
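As a generic, textbook-level illustration of estimating an unknown but constant phase shift from a small set of expectation values (not the encoding-optimization protocol of the paper), a single-qubit version can be simulated as follows.

    # Estimate a constant, unknown phase on the state (|0> + e^{i*phi}|1>)/sqrt(2)
    # from measured <X> and <Y>, then compensate it. Purely illustrative.
    import math, random

    true_phase = 0.7      # the unknown phase applied by the hardware (radians)
    shots = 2000

    def expectation(basis):
        """Simulated shot-noise estimate of <X> or <Y> for the state above."""
        mean = math.cos(true_phase) if basis == "X" else math.sin(true_phase)
        p_plus = 0.5 * (1 + mean)
        ones = sum(random.random() < p_plus for _ in range(shots))
        return 2 * ones / shots - 1

    est = math.atan2(expectation("Y"), expectation("X"))
    print(f"estimated phase: {est:.3f} rad (true {true_phase} rad)")
    print(f"apply Rz({-est:.3f}) to compensate")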

Relevance: 100.00%

Abstract:

Light rainfall is the baseline input to the annual water budget in mountainous landscapes throughout the tropics and at mid-latitudes. In the Southern Appalachians, the contribution from light rainfall ranges from 50-60% during wet years to 80-90% during dry years, with convective activity and tropical cyclone input providing most of the interannual variability. The Southern Appalachians is a region characterized by rich biodiversity that is vulnerable to land use/land cover changes due to its proximity to a rapidly growing population. Persistent near-surface moisture and the associated microclimates observed in this region have been well documented since the colonization of the area in terms of species health, fire frequency, and overall biodiversity. The overarching objective of this research is to elucidate the microphysics of light rainfall and the dynamics of low-level moisture in the inner region of the Southern Appalachians during the warm season, with a focus on orographically mediated processes. The overarching research hypothesis is that the physical processes leading to and governing the life cycle of orographic fog, low-level clouds, and precipitation, and their interactions, are strongly tied to landform, land cover, and the diurnal cycles of flow patterns, radiative forcing, and surface fluxes at the ridge-valley scale. The following science questions are addressed specifically: 1) How do orographic clouds and fog affect the hydrometeorological regime from event to annual scale and as a function of terrain characteristics and land cover? 2) What are the source areas, governing processes, and relevant time scales of near-surface moisture convergence patterns in the region? 3) What are the four-dimensional microphysical and dynamical characteristics, including variability, controlling factors, and processes, of fog and light rainfall? The research was conducted with two major components: 1) ground-based, high-quality observations using multi-sensor platforms and 2) interpretive numerical modeling guided by analysis of the in situ data. The findings reveal a high level of spatial (down to the ridge scale) and temporal (from event to annual scale) heterogeneity in the observations, and a significant impact on the hydrological regime resulting from seeder-feeder interactions among fog, low-level clouds, and stratiform rainfall that enhance coalescence efficiency and lead to significantly higher rainfall rates at the land surface. Specifically, results show that short-term accumulations in an event can be enhanced by up to one order of magnitude when fog is present concurrently. Results also show that events are strongly modulated by terrain characteristics, including elevation, slope, geometry, and land cover; these factors produce interactions between highly localized flows and gradients of temperature and moisture on the one hand and larger-scale circulations on the other. The resulting observations of drop size distributions (DSDs) and rainfall patterns, stratified by region and altitude, exhibit clear diurnal and seasonal cycles.

Relevance: 100.00%

Abstract:

The coupling of mechanical stress fields in polymers to covalent chemistry (polymer mechanochemistry) has provided access to previously unattainable chemical reactions and polymer transformations. In the bulk, mechanochemical activation has been used as the basis for new classes of stress-responsive polymers that demonstrate stress/strain sensing, shear-induced intermolecular reactivity for molecular level remodeling and self-strengthening, and the release of acids and other small molecules that are potentially capable of triggering further chemical response. The potential utility of polymer mechanochemistry in functional materials is limited, however, by the fact that to date, all reported covalent activation in the bulk occurs in concert with plastic yield and deformation, so that the structure of the activated object is vastly different from its nascent form. Mechanochemically activated materials have thus been limited to “single use” demonstrations, rather than as multi-functional materials for structural and/or device applications. Here, we report that filled polydimethylsiloxane (PDMS) elastomers provide a robust elastic substrate into which mechanophores can be embedded and activated under conditions from which the sample regains its original shape and properties. Fabrication is straightforward and easily accessible, providing access for the first time to objects and devices that either release or reversibly activate chemical functionality over hundreds of loading cycles.

While the mechanically accelerated ring-opening reaction of spiropyran to merocyanine and the associated color change provide a useful method by which to image the molecular-scale stress/strain distribution within a polymer, the magnitude of the forces necessary for activation had yet to be quantified. Here, we report single-molecule force spectroscopy studies of two spiropyran isomers. Ring opening on the timescale of tens of milliseconds is found to require forces of ~240 pN, well below those of previously characterized covalent mechanophores. The lower threshold force is a combination of a low force-free activation energy and the fact that the change in rate with force (activation length) of each isomer is greater than that inferred in other systems. Importantly, quantifying the magnitude of the forces required to activate individual spiropyran-based force probes enables the probe to act as a “scout” of molecular forces in materials, whose observed behavior can be extrapolated to predict the reactivity of potential mechanophores within a given material and deformation.
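The force-rate relationship referred to above is commonly formalized with the Bell model, k(F) = k0·exp(F·Δx/kBT), where Δx is the activation length; the short calculation below uses hypothetical activation lengths (not fitted values from this study) only to show why a larger Δx yields a much larger rate acceleration at the ~240 pN scale.

    # Bell-model rate acceleration k(F)/k0 = exp(F * dx / kBT) at room temperature.
    # The activation lengths dx below are hypothetical, for illustration only.
    import math

    kBT = 4.11      # thermal energy at ~298 K, in pN*nm
    force = 240.0   # pN, the reported threshold scale for spiropyran ring opening

    for dx in (0.1, 0.2, 0.3):   # hypothetical activation lengths in nm
        print(f"dx = {dx} nm -> rate accelerated by {math.exp(force * dx / kBT):.2e}x")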

We subsequently translated the design platform to existing dynamic soft technologies to fabricate the first mechanochemically responsive devices: first, by remotely inducing dielectric patterning of an elastic substrate to produce assorted fluorescent patterns in concert with topological changes; and second, by adopting a soft robotic platform to produce a color change from the strains inherent to pneumatically actuated robotic motion. As shown herein, covalent polymer mechanochemistry provides a viable mechanism to convert the same mechanical potential energy used for actuation into value-added, constructive covalent chemical responses. The color change associated with actuation suggests opportunities not only for new color-changing or camouflaging strategies, but also for the simultaneous activation of latent chemistry (e.g., release of small molecules, change in mechanical properties, activation of catalysts, etc.) in soft robots. In addition, mechanochromic stress mapping in a functional actuating device might provide a useful design and optimization tool, revealing spatial and temporal force evolution within the actuator in a way that might also be coupled to feedback loops that allow autonomous self-regulation of activity.

In the future, both the specific material and the general approach should be useful in enriching the responsive functionality of soft elastomeric materials and devices. We anticipate the development of new mechanophores that, like the materials, are reversibly and repeatedly activated, expanding the capabilities of soft, active devices and further permitting dynamic control over chemical reactivity that is otherwise inaccessible, each in response to a single remote signal.

Relevance: 100.00%

Abstract:

We investigate the performance of dual-hop two-way amplify-and-forward (AF) relaying in the presence of in-phase and quadrature-phase imbalance (IQI) at the relay node. In particular, the effective signal-to-interference-plus-noise ratio (SINR) at both sources is derived. These SINRs are used to design an instantaneous power allocation scheme that maximizes the minimum SINR of the two sources under a total transmit power constraint. The solution to this optimization problem is determined analytically and used to evaluate the outage probability (OP) of the considered two-way AF relaying system. Both analytical and numerical results show that IQI can impose fundamental performance limits on two-way relaying, which cannot be avoided simply by improving the channel conditions.
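The max-min structure of the power allocation can be illustrated with a small numerical search; the SINR expressions below are generic placeholders that do not include the IQI terms derived in the paper, and the grid search stands in for the paper's analytical solution.

    # Toy max-min power split between the two sources under a total power constraint.
    # The SINR model is a generic placeholder, not the IQI-aware expressions of the paper.
    P_TOTAL = 1.0
    GAIN_1, GAIN_2, NOISE = 0.9, 0.4, 0.05    # assumed effective gains and noise level

    def sinr_pair(p1):
        p2 = P_TOTAL - p1
        # Each source decodes the other's signal; the small power-dependent term stands
        # in for residual impairment (e.g. imperfect cancellation), purely illustrative.
        sinr_1 = GAIN_2 * p2 / (NOISE + 0.1 * GAIN_1 * p1)
        sinr_2 = GAIN_1 * p1 / (NOISE + 0.1 * GAIN_2 * p2)
        return sinr_1, sinr_2

    best_p1, best_min = 0.0, -1.0
    for step in range(1, 1000):               # brute-force search over the power split
        p1 = step / 1000 * P_TOTAL
        worst = min(sinr_pair(p1))
        if worst > best_min:
            best_p1, best_min = p1, worst

    print(f"P1 = {best_p1:.3f}, P2 = {P_TOTAL - best_p1:.3f}, min SINR = {best_min:.2f}")

At the optimum of such a max-min problem the two SINRs typically become equal (SINR balancing), since power shifted toward the better-off source only lowers the worse one.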

Relevance: 100.00%

Abstract:

Wireless sensor networks (WSNs) differ from conventional distributed systems in many respects. The resource limitations of sensor nodes and the ad hoc communication and topology of the network, coupled with an unpredictable deployment environment, are difficult non-functional constraints that must be carefully taken into account when developing software systems for a WSN. Thus, more research needs to be done on designing, implementing, and maintaining software for WSNs. This thesis aims to contribute to research in this area by presenting an approach to WSN application development that improves the reusability, flexibility, and maintainability of the software. First, we present a programming model and software architecture aimed at describing WSN applications independently of the underlying operating system and hardware. The proposed architecture is described and realized using the Model-Driven Architecture (MDA) standard in order to achieve satisfactory levels of encapsulation and abstraction when programming sensor nodes. In addition, we study different non-functional constraints of WSN applications and propose two approaches to optimizing applications to satisfy these constraints. A real prototype framework was built to demonstrate the solutions developed in the thesis. The framework implements the programming model and the multi-layered software architecture as components, and includes a graphical interface, code generation components, and supporting tools to help developers design, implement, optimize, and test WSN software. Finally, we evaluate and critically assess the proposed concepts. Two case studies are provided to support the evaluation. The first case study, a framework evaluation, is designed to assess the ease with which novice and intermediate users can develop correct and power-efficient WSN applications, the portability achieved by developing applications at a high level of abstraction, and the estimated overhead of using the framework in terms of the memory footprint and executable code size of the application. In the second case study, we discuss the design, implementation, and optimization of a real-world application named TempSense, in which a sensor network is used to monitor the temperature within an area.
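As a flavour of the kind of platform-neutral programming model the thesis argues for, the sketch below defines a hardware-independent sensing task that a code generator could later map onto a concrete sensor-node OS; all names and interfaces are invented for illustration and are not the framework or the TempSense implementation described in the thesis.

    # Invented, hardware-independent description of a periodic sensing task; a
    # model-driven toolchain would translate something like this into OS-specific code.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class SensingTask:
        sensor: str                         # logical sensor name, e.g. "temperature"
        period_s: float                     # sampling period in seconds
        on_sample: Callable[[float], None]  # application callback

    class AbstractNode:
        """Platform-neutral node API; concrete back ends would bind it to real hardware."""
        def __init__(self):
            self.tasks = []

        def register(self, task: SensingTask):
            self.tasks.append(task)

        def replay(self, readings):
            """Feed recorded readings to the registered tasks (stand-in for real sampling)."""
            for task in self.tasks:
                for value in readings.get(task.sensor, []):
                    task.on_sample(value)

    node = AbstractNode()
    node.register(SensingTask("temperature", 60.0,
                              lambda v: print(f"alert: {v} C") if v > 30 else None))
    node.replay({"temperature": [22.5, 31.0]})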