991 results for Paradigm management


Relevance:

30.00%

Publisher:

Abstract:

This is the first study to adopt a configurational paradigm in an investigation of strategic management accounting (SMA) adoption. The study examines the alignment and effectiveness of strategic choice and SMA system design configurations. Six configurations were derived empirically by deploying a cluster analysis of data collected from a sample of 193 large Slovenian companies. The first four clusters appear to provide some support for the central configurational proposition that higher levels of vertical and horizontal configurational alignment are associated with higher levels of performance. Evidence that contradicts the theory is also apparent, however, as the remaining two clusters exhibit high degrees of SMA vertical and horizontal alignment but low performance levels. A particular contribution of the paper is its demonstration of how the configurational paradigm can be operationalised to examine management accounting phenomena, and of the nature of the management accounting insights that can be derived from applying the approach.
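
As a rough illustration of how a configurational analysis of this kind can be operationalised, the sketch below clusters synthetic strategy/SMA-design scores and compares performance across the resulting groups; the variable names, synthetic data and parameter choices are assumptions for the example, not the study's instrument or results.

# Illustrative sketch only: deriving configurations by cluster analysis, in the
# spirit of the study. Variables and data are invented for the example.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_firms = 193  # sample size reported in the abstract

# Hypothetical survey items: strategic-choice and SMA-design scores plus performance.
df = pd.DataFrame({
    "strategy_prospector": rng.normal(4, 1, n_firms),
    "sma_costing": rng.normal(3.5, 1, n_firms),
    "sma_competitor_accounting": rng.normal(3, 1, n_firms),
    "performance": rng.normal(3.8, 0.8, n_firms),
})

design_vars = ["strategy_prospector", "sma_costing", "sma_competitor_accounting"]
X = StandardScaler().fit_transform(df[design_vars])        # put items on a common scale
df["configuration"] = KMeans(n_clusters=6, n_init=10,
                             random_state=0).fit_predict(X)

# Compare mean performance across the empirically derived configurations.
print(df.groupby("configuration")["performance"].mean())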

Relevance:

30.00%

Publisher:

Abstract:

Report on the scientific sojourn at the University of California at Berkeley, USA, from September 2007 until July 2008. Communities of Learning Practice is an innovative paradigm focused on providing appropriate technological support to both formal and, especially, informal learning groups, which are chiefly formed of non-technical people who lack the necessary resources to acquire such systems. Typically, students who are often separated by geography and/or time need to meet after classes in small study groups to carry out specific learning activities assigned during the formal learning process. However, the lack of suitable and available groupware applications makes it difficult for these groups of learners to collaborate and achieve their specific learning goals. In addition, the lack of democratic decision-making mechanisms is a major obstacle to replacing the central authority of knowledge found in formal learning.

Relevance:

30.00%

Publisher:

Abstract:

While mobile technologies can provide highly personalized services for mobile users, they also threaten their privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where a user's behavior, movements and habits can be associated with a consumer's personal identity. In this thesis, I study privacy issues in the mobile context, focusing on the design of an adaptive privacy management system for context-aware mobile devices, and explore the role of personalization and of control over the user's personal data. This allowed me to make multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive single sign-on solution that uses a user's context information to protect private information on the smartphone. To validate this solution, I first showed that a user's context is a unique user identifier and that context-awareness technology can increase users' perceived ease of use of the system and the service provider's authentication security. Following a design science research paradigm, I then implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus group interviews; overall, the proposed solution fulfilled the expected functions and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the role of personalization and control in my model and how these two elements interact with privacy calculus and the mobile technology model. In the practical realm, this thesis contributes to the understanding of the trade-off between the benefits of personalized services and the privacy concerns they may cause. By pointing out new opportunities to rethink how a user's context information can protect private data, it also suggests new elements for privacy-related business models.
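
As a rough, hypothetical illustration of the core idea that a user's context can act as an implicit identifier gating single sign-on, the sketch below compares the observed context against a stored profile and escalates to explicit authentication when it deviates; the attributes, threshold and profile format are invented for the example and are not the thesis's actual design.

# Minimal sketch: context as an implicit identifier for adaptive sign-on.
# Field names and the threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Context:
    wifi_ssid: str
    location_cell: str
    hour: int

def context_score(profile: dict, ctx: Context) -> float:
    """Fraction of context attributes that match the user's learned profile."""
    checks = [
        ctx.wifi_ssid in profile["known_ssids"],
        ctx.location_cell in profile["usual_cells"],
        profile["active_hours"][0] <= ctx.hour <= profile["active_hours"][1],
    ]
    return sum(checks) / len(checks)

def authenticate(profile: dict, ctx: Context, threshold: float = 0.67) -> str:
    # Familiar context: silent single sign-on; unfamiliar context: ask for credentials.
    return "sso_granted" if context_score(profile, ctx) >= threshold else "prompt_password"

profile = {"known_ssids": {"HomeNet"}, "usual_cells": {"cell-42"}, "active_hours": (7, 23)}
print(authenticate(profile, Context("HomeNet", "cell-42", 9)))   # sso_granted
print(authenticate(profile, Context("CafeWifi", "cell-99", 3)))  # prompt_password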

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: The management of large skull base lesions such as vestibular schwannomas (VS) is challenging. Microsurgery remains the main treatment option. Combined approaches (planned subtotal resection followed by gamma knife surgery (GKS) for long-term control of the residual tumor) are being increasingly considered to reduce the risk of neurological deficits associated with complete resection. The current study aims to prospectively evaluate the safety and efficacy of this combined approach in patients with large VS. MATERIALS AND METHODS: We present our experience with planned subtotal resection followed by GKS in a consecutive series of 20 patients with large vestibular schwannomas treated between 2009 and 2014 at Lausanne University Hospital, Switzerland. Clinical and radiological data and audiograms were prospectively collected for all patients, before and after surgery and before and after GKS, at regular intervals, in dedicated case-report forms. Additionally, dose-planning parameters were recorded for GKS. RESULTS: Twenty patients (6 males and 14 females) with large VS were treated by this approach. The mean age at the time of surgery was 51.6 years (range 34.4-73.4). The mean presurgical diameter was 36.7 mm (range 26.1-45). The mean presurgical tumor volume was 15.9 cm³ (range 5-34.9). Three patients (15%) needed a second surgical intervention because the tumor remnant was considered too large for safe GKS. The mean follow-up after surgery was 27.2 months (range 6-61.3). The timing of GKS was decided on the basis of the residual tumor shape and size following surgery. The mean interval between surgery and GKS was 7.6 months (range 4-13.9, median 6 months). The mean tumor volume at the time of GKS was 4.1 cm³ (range 0.5-12.8). The mean prescription isodose volume was 6.3 cm³ (range 0.8-15.5). The mean number of isocenters was 20.4 (range 11-31) and the mean marginal prescription dose was 11.7 Gy (range 11-12). There were no major complications in our series. Postoperative status showed normal facial nerve function (House-Brackmann grade I) in all patients. Six patients with useful pre-operative hearing (Gardner-Robertson (GR) class 1) underwent surgery with the aim of preserving cochlear nerve function; of these, 5 (83.3%) remained in GR class 1 and one (16.7%) lost hearing (GR class 5). Two patients with GR class 3 at baseline remained in the same class, although tonal audiometry improved in one of them during follow-up. Eleven patients (57.8%) were in GR class 5 preoperatively; one of them improved after surgery, reaching GR class 3 postoperatively. Following GKS, there were no new neurological deficits, with facial and hearing function remaining identical to their postsurgical status. CONCLUSION: Our data suggest that planned subtotal resection followed by GKS yields an excellent clinical outcome with respect to retaining facial and cochlear nerve function. This represents a paradigm shift in treatment goals, from complete tumor excision to surgery designed to preserve neural function. As long-term results emerge, this combined approach (microsurgery and GKS) will most probably become the standard of care in the management of large vestibular schwannomas.

Relevance:

30.00%

Publisher:

Abstract:

Due to various contexts and processes, forensic science communities may have different approaches, largely influenced by their criminal justice systems. However, forensic science practices share some common characteristics. One is the assurance of high (scientific) quality within processes and practices. For most crime laboratory directors and forensic science associations, this issue is conditioned by the triangle of quality, which represents the current paradigm of quality assurance in the field. It consists of the implementation of standardization, certification, accreditation, and an evaluation process. It constitutes a clear and sound way to exchange data between laboratories and enables databasing, because standardized methods ensure reliable and valid results; it is also a means of defining minimum requirements for practitioners' skills in specific forensic science activities. The control of each of these aspects offers non-forensic-science partners the assurance that the entire process has been mastered and is trustworthy. Most of the standards focus on the analysis stage and do not consider the pre- and post-laboratory stages, namely the work achieved at the investigation scene and the evaluation and interpretation of the results, intended for intelligence beneficiaries or for court. Such localized consideration prevents forensic practitioners from identifying where the problems really lie with regard to criminal justice systems. According to a performance-management approach, scientific quality should not be restricted to standardized procedures and controls in forensic science practice. Ensuring high quality also strongly depends on the way a forensic science culture is assimilated (through specific education, training and workplaces) and on the way practitioners understand forensic science as a whole.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: To refine the classic definition of, and provide a working definition for, congenital high airway obstruction syndrome (CHAOS) and to discuss the various aspects of long-term airway reconstruction, including the range of laryngeal anomalies and the various techniques for reconstruction. DESIGN: Retrospective chart review. PATIENTS: Four children (age range, 2-8 years) with CHAOS who presented to a single tertiary care children's hospital for pediatric airway reconstruction between 1995 and 2000. CONCLUSIONS: To date, CHAOS remains poorly described in the otolaryngologic literature. We propose the following working definition for pediatric cases of CHAOS: any neonate who needs a surgical airway within 1 hour of birth owing to high upper airway (ie, glottic, subglottic, or upper tracheal) obstruction and who cannot be tracheally intubated other than through a persistent tracheoesophageal fistula. Therefore, CHAOS has 3 possible presentations: (1) complete laryngeal atresia without an esophageal fistula, (2) complete laryngeal atresia with a tracheoesophageal fistula, and (3) near-complete high upper airway obstruction. Management of the airway, particularly in regard to long-term reconstruction, in children with CHAOS is complex and challenging.

Relevance:

30.00%

Publisher:

Abstract:

The traditional model of learning based on knowledge transfer does not promote the acquisition of information-related competencies or the development of autonomous learning. More needs to be done to embrace learner-centred approaches based on constructivism, collaboration and co-operation. This new learning paradigm is aligned with the European Higher Education Area (EHEA) requirements. In this sense, a learning experience based on faculty-librarian collaboration was seen as the best option for promoting student engagement and also a way to increase information-related competencies in the Open University of Catalonia (UOC) academic context. This case study outlines the benefits of teacher-librarian collaboration in terms of pedagogical innovation, resource management, the introduction of open educational resources (OER) in virtual classrooms, information literacy (IL) training and the use of 2.0 tools in teaching. Our faculty-librarian collaboration aims to provide an example of technology-enhanced learning and to demonstrate how working together improves the quality and relevance of educational resources in UOC's virtual classrooms. Under this new approach, while teachers change their role from instructors to facilitators of the learning process and extend their reach to students, libraries acquire an important presence in academic learning communities.

Relevance:

30.00%

Publisher:

Abstract:

Enterprise architectures (EA) are considered promising approaches to reducing the complexity of growing information technology (IT) environments while keeping pace with an ever-changing business environment. However, the implementation of enterprise architecture management (EAM) has proven difficult in practice. Many EAM initiatives face severe challenges, as demonstrated by the low usage level of enterprise architecture documentation and enterprise architects' lack of authority to enforce EAM standards and principles. These challenges motivate our research. Based on three field studies, we first analyze the EAM implementation issues that arise when EAM is started as a dedicated and isolated initiative. Following a design-oriented paradigm, we then suggest a design theory for architecture-driven IT management (ADRIMA) that may guide organizations in successfully implementing EAM. This theory summarizes prescriptive knowledge related to embedding EAM practices, artefacts and roles in existing IT management processes and the organization.

Relevance:

30.00%

Publisher:

Abstract:

Management of neurocritical care patients is focused on the prevention and treatment of secondary brain injury, i.e. the pathophysiological intracerebral (edema, ischemia, energy dysfunction, seizures) and systemic (hyperthermia, disorders of glucose homeostasis) events that occur following the initial insult (stroke, hemorrhage, head trauma, brain anoxia) and may aggravate patient outcome. The current therapeutic paradigm is based on multimodal neuromonitoring, including invasive (intracranial pressure, brain oxygen, cerebral microdialysis) and non-invasive (transcranial Doppler, near-infrared spectroscopy, EEG) tools that allow targeted, individualized management of acute coma in the early phase. The aim of this review is to describe the utility of multimodal neuromonitoring for the critical care management of acute coma.

Relevance:

30.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors, but also by dispersal and colonisation of new areas. Habitat suitability and obstacles to dispersal therefore also had to be modelled. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, result validation and further processing; a module also allows mapping of dispersal barriers and corridors. The ENFA's application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information were finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density-dependence and stochasticity. A software tool named HexaSpace was developed, which performs two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper); (2) running simulations. It allows studying the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software tools were designed to proceed from raw data to a complex, realistic model, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
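
As a rough illustration of the kind of age-structured projection described for SIM-Ibex (a Leslie-matrix model with density-dependence, environmental stochasticity and culling), the following sketch uses made-up demographic rates, carrying capacity and cull fraction; it is not the calibrated ibex model.

# Minimal sketch of an age-structured Leslie-matrix projection with
# density-dependence, environmental stochasticity and a culling term.
# All rates, the carrying capacity and the cull fraction are illustrative.
import numpy as np

rng = np.random.default_rng(0)

fecundity = np.array([0.0, 0.4, 0.8, 0.8])   # offspring per female, by age class (assumed)
survival  = np.array([0.6, 0.8, 0.85])       # survival to the next age class (assumed)
K = 1200.0                                    # assumed carrying capacity

def step(n, cull_rate=0.05):
    """Project the age-class vector n one year forward."""
    N = n.sum()
    density_factor = max(0.0, 1.0 - N / K)            # density-dependent recruitment
    env = rng.normal(1.0, 0.1)                         # environmental stochasticity
    births = env * density_factor * (fecundity @ n)    # top row of the Leslie matrix
    survivors = survival * n[:-1]                      # ageing with survival (sub-diagonal)
    new_n = np.concatenate(([births], survivors))
    return new_n * (1.0 - cull_rate)                   # regulation by culling

n = np.array([300.0, 250.0, 200.0, 150.0])
for year in range(20):
    n = step(n)
print(f"population after 20 years: {n.sum():.0f}")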

Relevance:

30.00%

Publisher:

Abstract:

Industrial services have been recognized as a potential source of additional revenue. In the dynamic world of industrial services, customization and the ability to act quickly are critical parts of creating customer satisfaction and competitive advantage. By shortening the time spent in the supply chain, both better response times and lower total costs can be achieved. The aim of this thesis is to describe the dynamic environment of industrial services: customer needs, as well as the opportunities to narrow the gap between requested and achieved delivery times. This is pursued mainly by means of strategic lead-time management. Operators of wireless telecommunication networks want to reduce the capital tied up in functions outside their core competence, such as maintenance. In the case study part of the thesis, the demand, material and information flows of the spare-part service supply chain are analyzed using qualitative interviews and internal documents as well as quantitative statistical methods. The findings are examined against the prevailing supply chain and time management paradigm. The results show that adopting a strong service culture and measuring supply chain performance holistically are the starting points of time management in industrial services.

Relevance:

30.00%

Publisher:

Abstract:

The number of Internet services is growing continuously. A person typically has one electronic identity in each service they use. Secure storage of authentication credentials becomes increasingly difficult as a new set accumulates with every new service registration. This master's thesis examines the problem and its solutions from both a service-oriented and a technical perspective. The business concept of service-oriented identity management and its implementation techniques, such as single sign-on (SSO) and the Security Assertion Markup Language (SAML), are reviewed through rough examples, as well as by examining the concept and technical details of the solution produced in the Nokia Account project. Finally, the implementation of the first version of the Nokia Account service is analyzed against the design principles and requirements of identity management services.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this study is to explore how the Open Innovation paradigm is applied by small and medium-sized enterprises in Russia. The focus of the study is to understand how the processes of research and development and of commercialization proceed in these kinds of companies, and to what extent they apply open innovation principles. The Russian leadership is taking steps to move from the export of raw materials to an innovation-driven model of economic growth. The research aims to disclose the actual impact of these attempts. The closed innovation model is described, together with the erosion factors that lead to its breakdown and to the emergence of the new model. Features of open innovation implementation and intellectual property rights protection in small and medium-sized enterprises are presented. To achieve the objective, a qualitative case study approach was chosen. The research includes facts and figures as well as the views and opinions of the studied companies' management regarding the innovation process in the company and in Russia in general. The research depicts the features of Open Innovation implementation by SMEs in Russia. A large number of research centers with the necessary equipment and qualified personnel allow the case companies to use external R&D effectively. They cooperate actively with research institutes, universities and laboratories. Thus, they apply inbound Open Innovation. In contrast, a lack of venture capital, low demand for technologies within the domestic market and weak protection of intellectual property limit the external paths to new markets. Licensing-out and the creation of spin-offs are isolated cases. Therefore, outbound Open Innovation is not a regular practice.

Relevance:

30.00%

Publisher:

Abstract:

One of the main challenges in Software Engineering is to cope with the transition from an industry based on software as a product to software as a service. The field of Software Engineering should provide the necessary methods and tools to develop and deploy new cost-efficient and scalable digital services. In this thesis, we focus on deployment platforms that ensure cost-efficient scalability of multi-tier web applications and of an on-demand video transcoding service under different types of load conditions. Infrastructure as a Service (IaaS) clouds provide Virtual Machines (VMs) under the pay-per-use business model. Dynamically provisioning VMs on demand allows service providers to cope with fluctuations in the number of service users. However, VM provisioning must be done carefully, because over-provisioning results in increased operational cost, while under-provisioning leads to a subpar service. Therefore, our main focus in this thesis is on cost-efficient VM provisioning for multi-tier web applications and on-demand video transcoding. Moreover, to prevent provisioned VMs from becoming overloaded, we augment VM provisioning with an admission control mechanism. Similarly, to ensure efficient use of provisioned VMs, web applications on under-utilized VMs are consolidated periodically. Thus, the main problem we address is cost-efficient VM provisioning augmented with server consolidation and admission control on the provisioned VMs. We seek solutions for two types of applications: multi-tier web applications that follow the request-response paradigm, and on-demand video transcoding that is based on video streams with soft real-time constraints. Our first contribution is a cost-efficient VM provisioning approach for multi-tier web applications. The proposed approach comprises two sub-approaches: a reactive VM provisioning approach called ARVUE, and a hybrid reactive-proactive VM provisioning approach called Cost-efficient Resource Allocation for Multiple web applications with Proactive scaling. Our second contribution is a prediction-based VM provisioning approach for on-demand video transcoding in the cloud. Moreover, to prevent virtualized servers from becoming overloaded, the proposed VM provisioning approaches are augmented with admission control approaches. Therefore, our third contribution is a session-based admission control approach for multi-tier web applications called adaptive Admission Control for Virtualized Application Servers. Similarly, the fourth contribution in this thesis is a stream-based admission control and scheduling approach for on-demand video transcoding called Stream-Based Admission Control and Scheduling. Our fifth contribution is a computation and storage trade-off strategy for cost-efficient video transcoding in cloud computing. Finally, the sixth and last contribution is a web application consolidation approach, which uses an Ant Colony System to minimize the under-utilization of the virtualized application servers.
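
As a rough illustration of the reactive, threshold-style provisioning and session-based admission control that such approaches build on, the following sketch scales the VM count with utilisation and rejects sessions beyond provisioned capacity; the thresholds, per-VM capacity and workload trace are assumptions for the example, not the thesis's actual algorithms.

# Minimal sketch of reactive VM provisioning with session-based admission control.
# Thresholds, capacities and the synthetic arrival trace are illustrative.
def provision(vms: int, utilisation: float,
              scale_out_at: float = 0.7, scale_in_at: float = 0.3,
              min_vms: int = 1, max_vms: int = 20) -> int:
    """Return the VM count for the next control interval."""
    if utilisation > scale_out_at and vms < max_vms:
        return vms + 1          # over-utilised: rent one more VM (pay-per-use)
    if utilisation < scale_in_at and vms > min_vms:
        return vms - 1          # under-utilised: consolidate to cut operational cost
    return vms

def admit(total_sessions: int, vms: int, sessions_per_vm: int = 100) -> bool:
    """Session-based admission control: refuse load beyond provisioned capacity."""
    return total_sessions <= vms * sessions_per_vm

vms, sessions = 2, 100
for arriving in [50, 120, 300, 80, 10]:          # synthetic per-interval arrivals
    if admit(sessions + arriving, vms):
        sessions += arriving                      # admitted; otherwise rejected
    utilisation = sessions / (vms * 100)
    vms = provision(vms, utilisation)
    print(f"sessions={sessions:4d}  utilisation={utilisation:.2f}  vms={vms}")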

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a novel design paradigm, called Virtual Runtime Application Partitions (VRAP), to judiciously utilize on-chip resources. As the dark silicon era approaches, in which power considerations will allow only a fraction of the chip to be powered on, judicious resource management will become a key consideration in future designs. Most works on resource management treat only the physical components (i.e. computation, communication, and memory blocks) as resources and manipulate the component-to-application mapping to optimize various parameters (e.g. energy efficiency). To further enhance the optimization potential, in addition to the physical resources we propose to manipulate abstract resources (i.e. the voltage/frequency operating point, the fault-tolerance strength, the degree of parallelism, and the configuration architecture). The proposed framework (i.e. VRAP) encapsulates methods, algorithms, and hardware blocks to provide each application with the abstract resources tailored to its needs. To test the efficacy of this concept, we have developed three distinct self-adaptive environments: (i) the Private Operating Environment (POE), (ii) the Private Reliability Environment (PRE), and (iii) the Private Configuration Environment (PCE), which collectively ensure that each application meets its deadlines using minimal platform resources. In this work, several novel architectural enhancements, algorithms and policies are presented to realize the virtual runtime application partitions efficiently. Considering future design trends, we have chosen Coarse Grained Reconfigurable Architectures (CGRAs) and Networks on Chip (NoCs) to test the feasibility of our approach. Specifically, we have chosen the Dynamically Reconfigurable Resource Array (DRRA) and McNoC as the representative CGRA and NoC platforms. The proposed techniques are compared and evaluated using a variety of quantitative experiments. Synthesis and simulation results demonstrate that VRAP significantly enhances energy and power efficiency compared to the state of the art.
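
As a rough illustration of treating an abstract resource such as the voltage/frequency operating point on a per-application basis, the following sketch picks the lowest-power point that still meets each application's deadline; the operating-point table, cycle counts and deadlines are invented for the example and are not DRRA/McNoC figures.

# Minimal sketch: the voltage/frequency operating point as an abstract resource.
# Each application gets the cheapest point that still meets its deadline.
from typing import NamedTuple, Optional

class OperatingPoint(NamedTuple):
    freq_mhz: int
    power_mw: float

# Assumed platform table, ordered from least to most power-hungry.
OPERATING_POINTS = [
    OperatingPoint(100, 20.0),
    OperatingPoint(200, 55.0),
    OperatingPoint(400, 150.0),
]

def select_point(cycles: int, deadline_ms: float) -> Optional[OperatingPoint]:
    """Return the lowest-power operating point whose execution time meets the deadline."""
    for op in OPERATING_POINTS:
        exec_time_ms = cycles / (op.freq_mhz * 1000.0)   # cycles / (cycles per ms)
        if exec_time_ms <= deadline_ms:
            return op
    return None  # no feasible point: the application cannot be admitted

for name, cycles, deadline in [("audio", 50_000, 1.0), ("video", 3_000_000, 10.0)]:
    print(name, "->", select_point(cycles, deadline))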