883 results for pacs: information technology applications


Relevance:

30.00%

Publisher:

Abstract:

Discrete Markov random field models provide a natural framework for representing images or spatial datasets. They model the spatial association present while providing a convenient Markovian dependency structure and strong edge-preservation properties. However, parameter estimation for discrete Markov random field models is difficult due to the complex form of the associated normalizing constant for the likelihood function. For large lattices, the reduced dependence approximation to the normalizing constant is based on the concept of performing computationally efficient and feasible forward recursions on smaller sublattices which are then suitably combined to estimate the constant for the whole lattice. We present an efficient computational extension of the forward recursion approach for the autologistic model to lattices that have an irregularly shaped boundary and which may contain regions with no data; these lattices are typical in applications. Consequently, we also extend the reduced dependence approximation to these scenarios enabling us to implement a practical and efficient non-simulation based approach for spatial data analysis within the variational Bayesian framework. The methodology is illustrated through application to simulated data and example images. The supplemental materials include our C++ source code for computing the approximate normalizing constant and simulation studies.
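As a rough illustration of the forward recursion that underlies the reduced dependence approximation (a minimal sketch with our own parameterisation, not the authors' code), the following Python snippet computes the exact normalizing constant of a small rectangular autologistic lattice by recursing over columns; the reduced dependence approximation applies the same recursion to smaller sublattices and combines the results.

    import itertools
    import numpy as np

    def autologistic_log_z(n_rows, n_cols, alpha, beta):
        # Exact log normalizing constant of a binary autologistic model on a
        # regular n_rows x n_cols lattice, via forward recursion over columns.
        # Feasible only for small n_rows (2**n_rows column configurations);
        # alpha and beta are the singleton and pairwise association parameters.
        states = np.array(list(itertools.product([0, 1], repeat=n_rows)))
        # Within-column potential: singleton terms plus vertical neighbour pairs.
        within = alpha * states.sum(axis=1) + beta * (states[:, :-1] * states[:, 1:]).sum(axis=1)
        # Between-column interaction: beta * <previous column, current column>.
        between = beta * states @ states.T
        log_f = within.astype(float)                 # recursion state after column 1
        for _ in range(1, n_cols):
            log_f = within + np.logaddexp.reduce(log_f[:, None] + between, axis=0)
        return np.logaddexp.reduce(log_f)

    # Example: autologistic_log_z(5, 5, 0.0, 0.4) gives log Z for a 5 x 5 lattice.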

Relevance:

30.00%

Publisher:

Abstract:

Organic solar cells based on a bulk heterojunction between a conductive polymer and a carbon nanostructure offer potential advantages compared to conventional inorganic cells. Low cost, light weight, flexibility and high peak power per unit weight are all features that can be considered a reality for organic photovoltaics. Although polymer/carbon nanotube solar cells have been proposed, only low power conversion efficiencies have been reached, without the mechanisms responsible for this poor performance being addressed. The purpose of this work is therefore to investigate the basic interaction between carbon nanotubes and poly(3-hexylthiophene) in order to demonstrate how this interaction affects the performance of photovoltaic devices. The outcomes of this study are contributions to the knowledge of the phenomena explaining the behaviour of electronic devices based on carbon nanotubes and poly(3-hexylthiophene). In this PhD, polymer thin films with the inclusion of uniformly distributed carbon nanotubes were deposited from solution and characterised. The bulk properties of the composites were studied with microscopy and spectroscopy techniques to provide evidence of higher degrees of polymer order when interacting with carbon nanotubes. Although bulk investigation techniques provided useful information about the interaction between the polymer and the nanotubes, the phenomena governing the heterojunction formed between the two species were investigated at the nanoscale to obtain clearer evidence. Identifying chirality-driven, polymer-assisted assembly on the carbon nanotube surface was one of the major achievements of this study. Moreover, the analysis of the electrical behaviour of the heterojunction between the polymer and the nanotube highlighted the charge transfer responsible for the low performance of photovoltaic devices. Polymer and carbon nanotube composite-based devices were fabricated and characterised in order to study their electronic properties. The introduction of carbon nanotubes into the polymer matrix produced a strong enhancement in electrical conductivity but also a lower photoconductivity response. Moreover, the extension of pristine-polymer device characterisation models to composite-based devices revealed the conduction mechanisms related to the nanotubes. Finally, the introduction of carbon nanotubes into the polymer matrix was demonstrated to improve the pristine polymer solar cell performance and spectral response, even though the power conversion efficiency remains too low.

Relevance:

30.00%

Publisher:

Abstract:

To cover a wide range of pulsed power applications, this paper proposes a modularity concept to improve the performance and flexibility of the pulsed power supply. The proposed scheme exploits the advantages of parallel and series configurations of flyback modules to obtain high voltage levels with a fast rise time (dv/dt). Prototypes were implemented using 600-V insulated-gate bipolar transistor (IGBT) switches to generate up to 4-kV output pulses with a 1-kHz repetition rate for experimentation. To assess the proposed modular approach for a higher number of modules, prototypes based on ten series modules were implemented using 1700-V IGBT switches and tested up to 20 kV. The experimental results verified the effectiveness of the proposed method.
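As a back-of-the-envelope illustration of the series-stacking idea (the figures below are our own assumptions, not measurements from the paper), the sketch shows how the outputs of identical flyback modules add when the modules are triggered simultaneously and how the aggregate dv/dt scales with the module count.

    # Illustrative only: N identical series-connected flyback modules fired together.
    n_modules = 10          # e.g. a ten-series-module stack
    v_module = 2.0e3        # assumed per-module output voltage (V)
    t_rise = 1.0e-6         # assumed per-module rise time (s)

    v_out = n_modules * v_module      # stacked output voltage: 20 kV
    dv_dt = v_out / t_rise            # aggregate slew rate if all modules switch together
    print(f"{v_out:.0f} V, {dv_dt:.1e} V/s")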

Relevance:

30.00%

Publisher:

Abstract:

This study proceeds from a central interest in the importance of systematically evaluating operational large-scale integrated information systems (IS) in organisations. The study is conducted within the IS-Impact Research Track at Queensland University of Technology (QUT). The goal of the IS-Impact Track is "to develop the most widely employed model for benchmarking information systems in organizations for the joint benefit of both research and practice" (Gable et al., 2009). The track espouses programmatic research having the principles of incrementalism, tenacity, holism and generalisability through replication and extension research strategies. Track efforts have yielded the bicameral IS-Impact measurement model; the ‘impact’ half includes the Organisational-Impact and Individual-Impact dimensions; the ‘quality’ half includes the System-Quality and Information-Quality dimensions. Akin to Gregor’s (2006) analytic theory, the IS-Impact model is conceptualised as a formative, multidimensional index and is defined as "a measure at a point in time, of the stream of net benefits from the IS, to date and anticipated, as perceived by all key-user-groups" (Gable et al., 2008, p. 381). The study adopts the IS-Impact model (Gable et al., 2008) as its core theory base. Prior work within the IS-Impact track has been consciously constrained to Financial IS for their homogeneity. This study adopts a context-extension strategy (Berthon et al., 2002) with the aim "to further validate and extend the IS-Impact measurement model in a new context - i.e. a different IS - Human Resources (HR)". The overarching research question is: "How can the impacts of large-scale integrated HR applications be effectively and efficiently benchmarked?" This managerial question (Cooper & Emory, 1995) decomposes into two more specific research questions in the new HR context: (RQ1) "Is the IS-Impact model complete?" and (RQ2) "Is the IS-Impact model valid as a 1st-order formative, 2nd-order formative multidimensional construct?" The study adhered to the two-phase approach of Gable et al. (2008) to hypothesise and validate a measurement model. The initial ‘exploratory phase’ employed a zero-base qualitative approach to re-instantiating the IS-Impact model in the HR context. The subsequent ‘confirmatory phase’ sought to validate the resultant hypothesised measurement model against newly gathered quantitative data. The unit of analysis for the study is the application ‘ALESCO’, an integrated large-scale HR application implemented at Queensland University of Technology (QUT), a large Australian university (with approximately 40,000 students and 5,000 staff). Target respondents of both study phases were the ALESCO key-user-groups: strategic users, management users, operational users and technical users, who directly use ALESCO or its outputs. An open-ended, qualitative survey was employed in the exploratory phase, with the objective of exploring the completeness and applicability of the IS-Impact model’s dimensions and measures in the new context, and of conceptualising any resultant model changes to be operationalised in the confirmatory phase. Responses from 134 ALESCO users to the main survey question, "What do you consider have been the impacts of the ALESCO (HR) system in your division/department since its implementation?"
were decomposed into 425 ‘impact citations.’ Citation mapping using a deductive (top-down) content analysis approach instantiated all dimensions and measures of the IS-Impact model, evidencing its content validity in the new context. Seeking to probe additional (perhaps negative) impacts, the survey included the additional open question "In your opinion, what can be done better to improve the ALESCO (HR) system?" Responses to this question decomposed into a further 107 citations, which in the main did not map to IS-Impact but rather coalesced around the concept of IS-Support. Deductively drawing from relevant literature, and working inductively from the unmapped citations, the new ‘IS-Support’ construct, including the four formative dimensions (i) training, (ii) documentation, (iii) assistance, and (iv) authorisation (each having reflective measures), was defined as: "a measure at a point in time, of the support, the [HR] information system key-user groups receive to increase their capabilities in utilising the system." Thus, a further goal of the study became validation of the IS-Support construct, suggesting the research question (RQ3): "Is IS-Support valid as a 1st-order reflective, 2nd-order formative multidimensional construct?" With the aim of validating IS-Impact within its nomological net (identification through structural relations), as in prior work, Satisfaction was hypothesised as its immediate consequence. The IS-Support construct, having derived from a question intended to probe IS-Impacts, was also hypothesised as an antecedent of Satisfaction, thereby suggesting the research question (RQ4): "What is the relative contribution of IS-Impact and IS-Support to Satisfaction?" With the goal of testing the above research questions, IS-Impact, IS-Support and Satisfaction were operationalised in a quantitative survey instrument. Partial least squares (PLS) structural equation modelling employing 221 valid responses largely evidenced the validity of the commencing IS-Impact model in the HR context. IS-Support too was validated as operationalised (including 11 reflective measures of its 4 formative dimensions). IS-Support alone explained 36% of Satisfaction; IS-Impact alone, 70%; in combination, both explained 71%, with virtually all influence of IS-Support subsumed by IS-Impact. Key study contributions to research include: (1) validation of IS-Impact in the HR context, (2) validation of a newly conceptualised IS-Support construct as an important antecedent of Satisfaction, and (3) validation of the redundancy of IS-Support when gauging IS-Impact. The study also makes valuable contributions to practice, the research track and the sponsoring organisation.

Relevance:

30.00%

Publisher:

Abstract:

Structural health monitoring (SHM) refers to the procedure used to assess the condition of structures so that their performance can be monitored and any damage can be detected early. Early detection of damage and appropriate retrofitting will aid in preventing failure of the structure, save money spent on maintenance or replacement, and ensure the structure operates safely and efficiently during its whole intended life. Though visual inspection and other techniques such as vibration-based ones are available for SHM of structures such as bridges, the use of the acoustic emission (AE) technique is an attractive option and is increasing in use. AE waves are high-frequency stress waves generated by rapid release of energy from localised sources within a material, such as crack initiation and growth. The AE technique involves recording these waves by means of sensors attached to the surface and then analysing the signals to extract information about the nature of the source. High sensitivity to crack growth, the ability to locate the source, its passive nature (no external energy needs to be supplied, as the energy from the damage source itself is utilised) and the possibility of performing real-time monitoring (detecting a crack as it occurs or grows) are some of the attractive features of the AE technique. In spite of these advantages, challenges still exist in using the AE technique for monitoring applications, especially in the area of analysis of recorded AE data, as large volumes of data are usually generated during monitoring. The need for effective data analysis can be linked with three main aims of monitoring: (a) accurately locating the source of damage; (b) identifying and discriminating signals from different sources of acoustic emission; and (c) quantifying the level of damage of the AE source for severity assessment. In the AE technique, the location of the emission source is usually calculated using the times of arrival and velocities of the AE signals recorded by a number of sensors. However, complications arise as AE waves can travel in a structure in a number of different modes that have different velocities and frequencies. Hence, to accurately locate a source it is necessary to identify the modes recorded by the sensors. This study has proposed and tested the use of time-frequency analysis tools such as the short-time Fourier transform to identify the modes and the use of the velocities of these modes to achieve very accurate results. Further, this study has explored the possibility of reducing the number of sensors needed for data capture by using the velocities of modes captured by a single sensor for source localization. A major problem in the practical use of the AE technique is the presence of AE sources other than those related to cracks, such as rubbing and impacts between different components of a structure. These spurious AE signals often mask the signals from the crack activity; hence discrimination of signals to identify the sources is very important. This work developed a model that uses different signal processing tools such as cross-correlation, magnitude-squared coherence and energy distribution in different frequency bands, as well as modal analysis (comparing amplitudes of identified modes), for accurately differentiating signals from different simulated AE sources. Quantification tools to assess the severity of the damage sources are highly desirable in practical applications.
Though different damage quantification methods have been proposed for the AE technique, not all have achieved universal acceptance or been found suitable for all situations. The b-value analysis, which involves studying the distribution of amplitudes of AE signals, and its modified form (known as improved b-value analysis), was investigated for its suitability for damage quantification in ductile materials such as steel. This was found to give encouraging results for the analysis of laboratory data, thereby extending the possibility of its use for real-life structures. By addressing these primary issues, it is believed that this thesis has helped improve the effectiveness of the AE technique for structural health monitoring of civil infrastructure such as bridges.
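A minimal sketch of the basic b-value calculation is given below, assuming the common AE convention of converting peak amplitudes in dB to a magnitude-like scale (amplitude/20) and the maximum-likelihood (Aki) estimator; the improved b-value analysis investigated in the thesis refines this basic form. A falling b-value is conventionally read as a shift from distributed microcracking towards macro-crack growth.

    import numpy as np

    def b_value(amplitudes_db, amp_min_db=None):
        # Gutenberg-Richter style b-value from AE peak amplitudes (in dB).
        # The dB/20 scaling and the Aki maximum-likelihood estimator are common
        # AE practice, not necessarily the exact formulation used in the thesis.
        mags = np.asarray(amplitudes_db, dtype=float) / 20.0
        m_min = mags.min() if amp_min_db is None else amp_min_db / 20.0
        mags = mags[mags >= m_min]
        return np.log10(np.e) / (mags.mean() - m_min)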

Relevance:

30.00%

Publisher:

Abstract:

Digital information that is place- and time-specific, is increasingly becoming available on all aspects of the urban landscape. People (cf. the Social Web), places (cf. the Geo Web), and physical objects (cf. ubiquitous computing, the Internet of Things) are increasingly infused with sensors, actuators, and tagged with a wealth of digital information. Urban informatics research explores these emerging digital layers of the city at the intersection of people, place and technology. However, little is known about the challenges and new opportunities that these digital layers may offer to road users driving through today’s mega cities. We argue that this aspect is worth exploring in particular with regards to Auto-UI’s overarching goal of making cars both safer and more enjoyable. This paper presents the findings of a pilot study, which included 14 urban informatics research experts participating in a guided ideation (idea creation) workshop within a simulated environment. They were immersed into different driving scenarios to imagine novel urban informatics type of applications specific to the driving context.

Relevance:

30.00%

Publisher:

Abstract:

The representation of business process models has been a continuing research topic for many years now. However, many process model representations have not developed beyond minimally interactive 2D icon-based representations of directed graphs and networks, with little or no annotation for information overlays. With the rise of desktop computers and commodity mobile devices capable of supporting rich interactive 3D environments, we believe that much of the research performed in computer human interaction, virtual reality, games and interactive entertainment has much potential in areas of BPM; to engage, provide insight, and to promote collaboration amongst analysts and stakeholders alike. This initial visualization workshop seeks to initiate the development of a high quality international forum to present and discuss research in this field. Via this workshop, we intend to create a community to unify and nurture the development of process visualization topics as a continuing research area.

Relevance:

30.00%

Publisher:

Abstract:

New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The "error budget" is set by the performance requirements of each application. The "expenditure" of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a "bottom up" component testing approach combined with "top down" system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
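The error-budget bookkeeping can be illustrated with a simple worst-case sum (the figures below are placeholders, not the measured values reported in the paper): if the total stays inside the application's limit, for example the 1 µs accuracy needed by a sampled value process bus, the synchronization design passes.

    # Illustrative PTP error budget check; all numbers are assumed placeholders.
    budget_ns = 1000                      # application requirement: +/- 1 microsecond

    contributions_ns = {
        "grandmaster accuracy": 100,
        "transparent clock 1 (correction field error)": 50,
        "transparent clock 2 (correction field error)": 50,
        "slave clock servo error": 200,
    }

    total_ns = sum(contributions_ns.values())
    print(f"total {total_ns} ns of {budget_ns} ns budget ->",
          "OK" if total_ns <= budget_ns else "budget exceeded")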

Relevance:

30.00%

Publisher:

Abstract:

Carbon nanotubes (CNTs) have excellent electrical, mechanical and electromechanical properties. When CNTs are incorporated into polymers, electrically conductive composites with high electrical conductivity at very low CNT content (often below 1 wt% CNT) result. Because their electrical properties change under mechanical load, carbon nanotube/polymer composites have attracted significant research interest, especially for their potential application in in-situ monitoring of stress distribution and active control of strain sensing in composite structures, or as strain sensors. To successfully develop novel devices for such applications, some of the major challenges that need to be overcome include: an in-depth understanding of structure-electrical conductivity relationships, the response of the composites under changing environmental conditions, and the piezoresistivity of different types of carbon nanotube/polymer sensing devices. In this thesis, the direct current (DC) and alternating current (AC) conductivity of CNT-epoxy composites was investigated. Details of the microstructure obtained by scanning electron microscopy were used to link observed electrical properties with structure using equivalent circuit modelling. The role of polymer coatings on macro- and micro-level electrical conductivity was investigated using atomic force microscopy. Thermal analysis and Raman spectroscopy were used to evaluate the heat flow and the deformation of carbon nanotubes embedded in the epoxy, respectively, and related to temperature-induced resistivity changes. A comparative assessment of piezoresistivity was conducted using randomly mixed carbon nanotube/epoxy composites, and new-concept epoxy- and polyurethane-coated carbon nanotube films. The results indicate that equivalent circuit modelling is a reliable technique for estimating values of the resistance and capacitive components in linear, low aspect ratio epoxy composites. Using this approach, the dominant role of tunnelling resistance in determining the electrical conductivity was confirmed, a result further verified using conductive atomic force microscopy analysis. Randomly mixed CNT-epoxy composites were found to be highly sensitive to mechanical strain and temperature variation compared to polymer-coated CNT films. In the vicinity of the glass transition temperature, the CNT-epoxy composites exhibited pronounced resistivity peaks. Thermal and Raman spectroscopy analyses indicated that this phenomenon can be attributed to physical aging of the epoxy matrix phase and structural rearrangement of the conductive network induced by matrix expansion. The resistivity of polymer-coated CNT composites was mainly dominated by the intrinsic resistivity of the CNTs and the CNT junctions, and their linear, weakly temperature-sensitive response can be described by a modified Luttinger liquid model. Piezoresistivity of the polymer-coated sensors was dominated by break-up of the conducting carbon nanotube network and the consequent degradation of nanotube-nanotube contacts, while that of the randomly mixed CNT-epoxy composites was determined by tunnelling resistance between neighbouring CNTs. This thesis has demonstrated that it is possible to use microstructure information to develop equivalent circuit models that are capable of representing the electrical conductivity of CNT/epoxy composites accurately.
New designs of carbon nanotube based sensing devices, utilising carbon nanotube films as the key functional element, can be used to overcome the high temperature sensitivity of randomly mixed CNT/polymer composites without compromising on desired high strain sensitivity. This concept can be extended to develop large area intelligent CNT based coatings and targeted weak-point specific strain sensors for use in structural health monitoring.
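As a hedged sketch of the equivalent-circuit idea (the component values are illustrative assumptions, not fitted values from the thesis), the snippet below models a single conducting path as CNT segments in series with tunnelling junctions, each junction a tunnelling resistance in parallel with a capacitance, and evaluates the complex impedance across frequency; at high frequency the junction capacitance shorts the tunnelling resistance and |Z| falls.

    import numpy as np

    def path_impedance(freq_hz, n_junctions=100, r_cnt=1e3, r_tunnel=1e7, c_tunnel=1e-15):
        # One percolation path: n_junctions CNT segments (r_cnt each) in series
        # with n_junctions tunnelling junctions, each an R-C parallel element.
        # All component values are illustrative assumptions.
        omega = 2 * np.pi * np.asarray(freq_hz, dtype=float)
        z_junction = r_tunnel / (1 + 1j * omega * r_tunnel * c_tunnel)   # R in parallel with C
        return n_junctions * (r_cnt + z_junction)

    freqs = np.logspace(1, 7, 7)                 # 10 Hz to 10 MHz
    print(np.abs(path_impedance(freqs)))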

Relevance:

30.00%

Publisher:

Abstract:

Good management, supported by accurate, timely and reliable health information, is vital for increasing the effectiveness of Health Information Systems (HIS). When it comes to managing the under resourced health systems of developing countries, information-based decision making is particularly important. This paper reports findings of a self-report survey that investigated perceptions of local health managers (HMs) of their own regional HIS in Sri Lanka. Data were collected through a validated, pre-tested postal questionnaire, and distributed among a selected group of HMs to elicit their perceptions of the current HIS in relation to information generation, acquisition and use, required reforms to the information system and application of information and communication technology (ICT). Results based on descriptive statistics indicated that the regional HIS was poorly organised and in need of reform; that management support for the system was unsatisfactory in terms of relevance, accuracy, timeliness and accessibility; that political pressure and community and donor requests took precedence over vital health information when management decisions were made; and use of ICT was unsatisfactory. HIS strengths included user-friendly paper formats, a centralised planning system and an efficient disease notification system; weaknesses were lack of comprehensiveness, inaccuracy, and lack of a feedback system. Responses of participants indicated that HIS would be improved by adopting an internationally accepted framework and introducing ICT applications. Perceived barriers to such improvements were high initial cost of educating staff to improve computer literacy, introduction of ICTs, and HIS restructure. We concluded that the regional HIS of Central Province, Sri Lanka had failed to provide much needed information support to HMs. These findings are consistent with similar research in other developing countries and reinforce the need for further research to verify causes of poor performance and to design strategic reforms to improve HIS in regional Sri Lanka.

Relevance:

30.00%

Publisher:

Abstract:

Despite the rapidly urbanising population, public transport usage in metropolitan areas is not growing at a level that corresponds to the trend. Many people are reluctant to travel using public transport, as it is commonly associated with unpleasant experiences such as limited services, long wait time, and crowded spaces. This study aims to explore the use of mobile spatial interactions and services, and investigate their potential to increase the enjoyment of our everyday commuting experience. The main goal is to develop and evaluate mobile-mediated design interventions to foster interactions for and among passengers, as well as between passengers and public transport infrastructures, with the aim to positively influence the experience of commuting. Ultimately, this study hopes to generate findings and knowledge towards creating a more enjoyable public transport experience, as well as to explore innovative uses of mobile technologies and context-aware services for the urban lifestyle.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an efficient face detection method suitable for real-time surveillance applications. Improved efficiency is achieved by constraining the search window of an AdaBoost face detector to pre-selected regions. Firstly, the proposed method takes a sparse grid of sample pixels from the image to reduce whole image scan time. A fusion of foreground segmentation and skin colour segmentation is then used to select candidate face regions. Finally, a classifier-based face detector is applied only to selected regions to verify the presence of a face (the Viola-Jones detector is used in this paper). The proposed system is evaluated using 640 x 480 pixels test images and compared with other relevant methods. Experimental results show that the proposed method reduces the detection time to 42 ms, where the Viola-Jones detector alone requires 565 ms (on a desktop processor). This improvement makes the face detector suitable for real-time applications. Furthermore, the proposed method requires 50% of the computation time of the best competing method, while reducing the false positive rate by 3.2% and maintaining the same hit rate.
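A rough sketch of the pipeline using OpenCV building blocks as stand-ins (the downsampling factor, colour thresholds and detector parameters are our assumptions, not the paper's implementation): foreground and skin-colour masks are fused on a reduced-resolution frame, and the Viola-Jones cascade is run only inside the resulting candidate regions.

    import cv2

    # Assumed components: OpenCV's bundled Haar cascade and MOG2 background subtractor.
    face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)
    SCALE = 0.25   # stand-in for the sparse-grid sampling: work on a reduced frame

    def detect_faces(frame):
        small = cv2.resize(frame, None, fx=SCALE, fy=SCALE)
        fg = bg_subtractor.apply(small)                                   # foreground mask
        ycrcb = cv2.cvtColor(small, cv2.COLOR_BGR2YCrCb)
        skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))          # skin-colour mask
        candidates = cv2.bitwise_and(fg, skin)                            # fused candidate mask
        contours, _ = cv2.findContours(candidates, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        faces = []
        for c in contours:
            x, y, w, h = [int(v / SCALE) for v in cv2.boundingRect(c)]    # back to full resolution
            if w < 20 or h < 20:
                continue
            roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
            # Verify the candidate region with the Viola-Jones detector only.
            for (dx, dy, dw, dh) in face_cascade.detectMultiScale(roi, 1.1, 3):
                faces.append((x + dx, y + dy, dw, dh))
        return faces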

Relevance:

30.00%

Publisher:

Abstract:

Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments. That is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way. If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design to accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs, the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually only surviving a small number of years after the initial onset of the disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists.
Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression owing to the body’s natural attempts at compensation. Part III of this thesis builds upon a previous Bayesian technique, which developed a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
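To make the likelihood-free idea concrete, here is a minimal ABC rejection sampler in Python (a deliberately simple baseline rather than the SMC-based algorithms developed in Part I; the toy model, summary statistics and tolerance are our own choices).

    import numpy as np

    rng = np.random.default_rng(0)

    def abc_rejection(observed, prior_sampler, simulator, summary, n_accept, tol):
        # Keep parameter draws whose simulated summary statistics land within
        # distance `tol` of the observed summaries; no likelihood is evaluated.
        s_obs = summary(observed)
        accepted = []
        while len(accepted) < n_accept:
            theta = prior_sampler(rng)
            s_sim = summary(simulator(theta, rng))
            if np.linalg.norm(s_sim - s_obs) < tol:
                accepted.append(theta)
        return np.array(accepted)

    # Toy usage: infer the mean of a normal model from sample mean and variance.
    data = rng.normal(3.0, 1.0, size=100)
    posterior_draws = abc_rejection(
        observed=data,
        prior_sampler=lambda r: r.normal(0.0, 10.0),
        simulator=lambda mu, r: r.normal(mu, 1.0, size=100),
        summary=lambda x: np.array([x.mean(), x.var()]),
        n_accept=500,
        tol=0.5,
    )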

Relevance:

30.00%

Publisher:

Abstract:

In the last decade, smartphones have gained widespread usage. Since the advent of online application stores, hundreds of thousands of applications have become instantly available to millions of smartphone users. Within the Android ecosystem, application security is governed by digital signatures and a list of coarse-grained permissions. However, this mechanism is not fine-grained enough to provide the user with sufficient means of controlling the applications' activities. The result is abuse of highly sensitive private information, such as phone numbers, without the user's notice. We show that there is a high frequency of privacy leaks even among widely popular applications. Together with the fact that the majority of users are not proficient in computer security, this presents a challenge to the engineers developing security solutions for the platform. Our contribution is twofold: first, we propose a service which is able to assess Android Market applications via static analysis and provide detailed, but readable, reports to the user. Second, we describe a means to mitigate security and privacy threats by automatically reverse-engineering and refactoring binary application packages according to the users' security preferences.
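As a hedged illustration of the kind of coarse-grained check such a service could start from (not the static-analysis engine described in the paper), the sketch below lists the permissions an APK requests via the Android SDK's aapt tool and flags a few privacy-sensitive ones; the chosen permission set and the output parsing are our assumptions.

    import subprocess

    # Permissions commonly linked to private data such as phone numbers and location.
    SENSITIVE = {
        "android.permission.READ_PHONE_STATE",
        "android.permission.READ_CONTACTS",
        "android.permission.ACCESS_FINE_LOCATION",
        "android.permission.READ_SMS",
    }

    def requested_permissions(apk_path):
        out = subprocess.run(["aapt", "dump", "permissions", apk_path],
                             capture_output=True, text=True, check=True).stdout
        perms = set()
        for line in out.splitlines():
            line = line.strip()
            if line.startswith("uses-permission"):
                # Handles both "uses-permission: android.permission.X" and
                # "uses-permission: name='android.permission.X'" output styles.
                perms.add(line.split(":", 1)[1].strip().replace("name=", "").strip("'"))
        return perms

    def privacy_flags(apk_path):
        return sorted(requested_permissions(apk_path) & SENSITIVE)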

Relevance:

30.00%

Publisher:

Abstract:

Private data stored on smartphones is a precious target for malware attacks. A constantly changing environment, e.g. switching network connections, can cause unpredictable threats and require an adaptive approach to access control. Context-based access control uses dynamic environmental information, incorporating it into access decisions. We propose an "ecosystem-in-an-ecosystem" that acts as a secure container for trusted software, aimed at enterprise scenarios where users are allowed to use private devices. We have implemented a proof-of-concept prototype for an access control framework that processes changes from low-level sensors and semantically enriches them, adapting access control policies to the current context. This allows the user or the administrator to maintain fine-grained control over resource usage by compliant applications. Hence, resources local to the trusted container remain under the control of the enterprise policy. Our results show that context-based access control can be implemented on smartphones without major performance impact.
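A minimal sketch of context-based policy evaluation (the context attributes, rule set and default-deny choice are our assumptions, not the framework's actual policy language): low-level sensor values are lifted into a context object, and the first matching rule decides whether a container resource may be used.

    from dataclasses import dataclass

    @dataclass
    class Context:
        network: str          # e.g. "enterprise_wifi", "public_wifi", "cellular"
        location: str         # e.g. "office", "unknown"
        screen_locked: bool

    # Illustrative policy: (resource, predicate on context, decision); first match wins.
    POLICY = [
        ("corporate_mail", lambda c: c.network == "enterprise_wifi" and not c.screen_locked, "allow"),
        ("corporate_mail", lambda c: True, "deny"),
        ("camera",         lambda c: c.location != "office", "allow"),
        ("camera",         lambda c: True, "deny"),
    ]

    def decide(resource, context):
        # Default deny keeps container resources under enterprise control
        # whenever no rule matches the current context.
        for res, predicate, decision in POLICY:
            if res == resource and predicate(context):
                return decision
        return "deny"

    ctx = Context(network="public_wifi", location="cafe", screen_locked=False)
    print(decide("corporate_mail", ctx))   # -> deny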