139 results for pacs: information technology applications


Relevance:

30.00%

Publisher:

Abstract:

New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet-based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The "error budget" is set by the performance requirements of each application. The "expenditure" of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a "bottom up" component testing approach combined with "top down" system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
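
The "error budget" bookkeeping described above amounts to simple worst-case arithmetic: each clock in the synchronization chain spends part of the application's allowance. A minimal sketch follows; all component error figures are hypothetical placeholders, not measured values from the paper.

```python
# Minimal sketch of PTP error-budget bookkeeping.
# Component error figures are hypothetical, not measurements from the paper.

APPLICATION_BUDGET_US = 1.0  # sampled value process bus requirement (1 microsecond)

# Hypothetical worst-case time error contributed by each clock in the chain.
component_errors_us = {
    "grandmaster clock": 0.10,
    "transparent clock (switch 1)": 0.05,
    "transparent clock (switch 2)": 0.05,
    "boundary clock": 0.15,
    "slave clock": 0.20,
}

def check_budget(budget_us, contributions):
    """Sum worst-case component errors and compare against the application budget."""
    spent = sum(contributions.values())
    for name, err in contributions.items():
        print(f"{name:30s} {err:6.2f} us")
    print(f"{'total expenditure':30s} {spent:6.2f} us (budget {budget_us:.2f} us)")
    return spent <= budget_us

if __name__ == "__main__":
    ok = check_budget(APPLICATION_BUDGET_US, component_errors_us)
    print("budget met" if ok else "budget exceeded")
```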

Relevance:

30.00%

Publisher:

Abstract:

Carbon nanotubes (CNTs) have excellent electrical, mechanical and electromechanical properties. When CNTs are incorporated into polymers, electrically conductive composites with high electrical conductivity at very low CNT content (often below 1 wt% CNT) result. Due to the change in electrical properties under mechanical load, carbon nanotube/polymer composites have attracted significant research interest, especially due to their potential for application in in-situ monitoring of stress distribution and active control of strain sensing in composite structures, or as strain sensors. To successfully develop novel devices for such applications, some of the major challenges that need to be overcome include: in-depth understanding of structure-electrical conductivity relationships, the response of the composites under changing environmental conditions, and the piezoresistivity of different types of carbon nanotube/polymer sensing devices. In this thesis, the direct current (DC) and alternating current (AC) conductivity of CNT-epoxy composites was investigated. Details of the microstructure obtained by scanning electron microscopy were used to link observed electrical properties with structure using equivalent circuit modelling. The role of polymer coatings on macro- and micro-level electrical conductivity was investigated using atomic force microscopy. Thermal analysis and Raman spectroscopy were used to evaluate the heat flow and deformation of carbon nanotubes embedded in the epoxy, respectively, and related to temperature-induced resistivity changes. A comparative assessment of piezoresistivity was conducted using randomly mixed carbon nanotube/epoxy composites and new-concept epoxy- and polyurethane-coated carbon nanotube films. The results indicate that equivalent circuit modelling is a reliable technique for estimating values of the resistive and capacitive components in linear, low-aspect-ratio CNT-epoxy composites. Using this approach, the dominant role of tunnelling resistance in determining the electrical conductivity was confirmed, a result further verified using conductive atomic force microscopy analysis. Randomly mixed CNT-epoxy composites were found to be highly sensitive to mechanical strain and temperature variation compared to polymer-coated CNT films. In the vicinity of the glass transition temperature, the CNT-epoxy composites exhibited pronounced resistivity peaks. Thermal and Raman spectroscopy analyses indicated that this phenomenon can be attributed to physical aging of the epoxy matrix phase and structural rearrangement of the conductive network induced by matrix expansion. The resistivity of polymer-coated CNT composites was dominated mainly by the intrinsic resistivity of the CNTs and the CNT junctions, and their linear, weakly temperature-sensitive response can be described by a modified Luttinger liquid model. The piezoresistivity of the polymer-coated sensors was dominated by break-up of the conducting carbon nanotube network and the consequent degradation of nanotube-nanotube contacts, while that of the randomly mixed CNT-epoxy composites was determined by tunnelling resistance between neighbouring CNTs. This thesis has demonstrated that it is possible to use microstructure information to develop equivalent circuit models that are capable of representing the electrical conductivity of CNT/epoxy composites accurately. New designs of carbon nanotube-based sensing devices, utilising carbon nanotube films as the key functional element, can be used to overcome the high temperature sensitivity of randomly mixed CNT/polymer composites without compromising the desired high strain sensitivity. This concept can be extended to develop large-area intelligent CNT-based coatings and targeted, weak-point-specific strain sensors for use in structural health monitoring.
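
The equivalent-circuit idea can be illustrated with a toy calculation: a percolating CNT path is often simplified as a series chain of tunnelling junctions, each modelled as a resistor in parallel with a capacitor. The network topology and component values below are hypothetical; the thesis derives its circuits from electron-microscopy images of real microstructures.

```python
# Illustrative sketch of equivalent-circuit modelling of a CNT-epoxy composite.
# Junction values are hypothetical placeholders.
import numpy as np

def parallel_rc_impedance(r, c, omega):
    """Impedance of one tunnelling junction: resistor in parallel with a capacitor."""
    return r / (1.0 + 1j * omega * r * c)

def network_impedance(junctions, omega):
    """Series chain of parallel-RC junctions along one percolating CNT path."""
    return sum(parallel_rc_impedance(r, c, omega) for r, c in junctions)

# Hypothetical junctions: (tunnelling resistance in ohm, junction capacitance in F)
junctions = [(1e6, 1e-12), (5e5, 2e-12), (2e6, 0.5e-12)]

for f in (1e2, 1e4, 1e6):  # frequency sweep in Hz
    z = network_impedance(junctions, 2 * np.pi * f)
    print(f"f = {f:8.0f} Hz  |Z| = {abs(z):.3e} ohm  phase = {np.angle(z, deg=True):6.2f} deg")
```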

Relevance:

30.00%

Publisher:

Abstract:

Good management, supported by accurate, timely and reliable health information, is vital for increasing the effectiveness of Health Information Systems (HIS). When it comes to managing the under-resourced health systems of developing countries, information-based decision making is particularly important. This paper reports findings of a self-report survey that investigated the perceptions of local health managers (HMs) of their own regional HIS in Sri Lanka. Data were collected through a validated, pre-tested postal questionnaire distributed among a selected group of HMs to elicit their perceptions of the current HIS in relation to information generation, acquisition and use, required reforms to the information system, and application of information and communication technology (ICT). Results based on descriptive statistics indicated that the regional HIS was poorly organised and in need of reform; that management support for the system was unsatisfactory in terms of relevance, accuracy, timeliness and accessibility; that political pressure and community and donor requests took precedence over vital health information when management decisions were made; and that use of ICT was unsatisfactory. HIS strengths included user-friendly paper formats, a centralised planning system and an efficient disease notification system; weaknesses were lack of comprehensiveness, inaccuracy, and lack of a feedback system. Responses of participants indicated that the HIS would be improved by adopting an internationally accepted framework and introducing ICT applications. Perceived barriers to such improvements were the high initial costs of educating staff to improve computer literacy, introducing ICTs, and restructuring the HIS. We concluded that the regional HIS of Central Province, Sri Lanka, had failed to provide much-needed information support to HMs. These findings are consistent with similar research in other developing countries and reinforce the need for further research to verify the causes of poor performance and to design strategic reforms to improve HIS in regional Sri Lanka.

Relevance:

30.00%

Publisher:

Abstract:

Despite the rapidly urbanising population, public transport usage in metropolitan areas is not growing at a level that corresponds to this trend. Many people are reluctant to travel by public transport, as it is commonly associated with unpleasant experiences such as limited services, long wait times, and crowded spaces. This study aims to explore the use of mobile spatial interactions and services, and to investigate their potential to increase the enjoyment of our everyday commuting experience. The main goal is to develop and evaluate mobile-mediated design interventions to foster interactions for and among passengers, as well as between passengers and public transport infrastructure, with the aim of positively influencing the experience of commuting. Ultimately, this study hopes to generate findings and knowledge towards creating a more enjoyable public transport experience, as well as to explore innovative uses of mobile technologies and context-aware services for the urban lifestyle.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an efficient face detection method suitable for real-time surveillance applications. Improved efficiency is achieved by constraining the search window of an AdaBoost face detector to pre-selected regions. Firstly, the proposed method takes a sparse grid of sample pixels from the image to reduce whole-image scan time. A fusion of foreground segmentation and skin colour segmentation is then used to select candidate face regions. Finally, a classifier-based face detector is applied only to the selected regions to verify the presence of a face (the Viola-Jones detector is used in this paper). The proposed system is evaluated using 640 x 480 pixel test images and compared with other relevant methods. Experimental results show that the proposed method reduces the detection time to 42 ms, where the Viola-Jones detector alone requires 565 ms (on a desktop processor). This improvement makes the face detector suitable for real-time applications. Furthermore, the proposed method requires 50% of the computation time of the best competing method, while reducing the false positive rate by 3.2% and maintaining the same hit rate.
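
A rough sketch of this kind of region-constrained pipeline is shown below using OpenCV's stock Viola-Jones cascade. The grid step, skin-colour thresholds and background subtractor are illustrative assumptions, not the paper's tuned components.

```python
# Sketch: fuse foreground and skin-colour cues, then run the cascade only
# inside candidate regions. Thresholds and parameters are guesses.
import cv2
import numpy as np

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
bg_subtractor = cv2.createBackgroundSubtractorMOG2()

def candidate_regions(frame, step=8):
    """Fuse foreground and skin-colour masks sampled on a sparse pixel grid."""
    fg = bg_subtractor.apply(frame)                            # foreground mask
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))   # rough skin range
    fused = cv2.bitwise_and(fg, skin)
    sparse = np.zeros_like(fused)
    sparse[::step, ::step] = fused[::step, ::step]              # sparse grid of samples
    mask = cv2.dilate(sparse, np.ones((step * 4, step * 4), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours]

def detect_faces(frame):
    """Apply the Viola-Jones cascade only to the pre-selected regions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = []
    for (x, y, w, h) in candidate_regions(frame):
        roi = gray[y:y + h, x:x + w]
        for (fx, fy, fw, fh) in face_cascade.detectMultiScale(roi, 1.1, 4):
            faces.append((x + fx, y + fy, fw, fh))
    return faces
```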

Relevance:

30.00%

Publisher:

Abstract:

Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation through repeated sampling of data from the model and comparison of observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way. If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design to accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists.
Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression due to the body’s natural attempts at compensation. Part III of this thesis builds upon a previous Bayesian technique, which develops a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
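
The "likelihood-free" idea mentioned in Part I can be illustrated with the most basic ABC rejection sampler: simulate data from the model and keep parameter draws whose summary statistics land close to the observed ones. The toy model (estimating a normal mean), prior and tolerance below are placeholders; the thesis develops far more efficient SMC-based versions.

```python
# Minimal ABC rejection sketch on a toy model with a pretend-intractable likelihood.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(loc=3.0, scale=1.0, size=100)
obs_summary = observed.mean()                    # summary statistic

def simulate(theta, n=100):
    """Stand-in for a simulator whose likelihood we cannot evaluate."""
    return rng.normal(loc=theta, scale=1.0, size=n)

def abc_rejection(n_draws=20000, tolerance=0.1):
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-10.0, 10.0)         # prior draw
        sim_summary = simulate(theta).mean()
        if abs(sim_summary - obs_summary) < tolerance:
            accepted.append(theta)               # keep draws that match the data
    return np.array(accepted)

posterior_sample = abc_rejection()
print(f"accepted {posterior_sample.size} draws, posterior mean ~ {posterior_sample.mean():.2f}")
```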

Relevance:

30.00%

Publisher:

Abstract:

In the last decade, smartphones have gained widespread usage. Since the advent of online application stores, hundreds of thousands of applications have become instantly available to millions of smartphone users. Within the Android ecosystem, application security is governed by digital signatures and a list of coarse-grained permissions. However, this mechanism is not fine-grained enough to provide the user with sufficient control over the applications' activities. The result is abuse of highly sensitive private information, such as phone numbers, without the user's knowledge. We show that there is a high frequency of privacy leaks even among widely popular applications. Together with the fact that the majority of users are not proficient in computer security, this presents a challenge to the engineers developing security solutions for the platform. Our contribution is twofold: first, we propose a service which is able to assess Android Market applications via static analysis and provide detailed but readable reports to the user. Second, we describe a means to mitigate security and privacy threats by automated reverse engineering and refactoring of binary application packages according to the user's security preferences.
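
To make the idea of flagging privacy-sensitive API use concrete, here is a deliberately trivial sketch that scans decompiled sources for well-known sensitive Android calls. The file layout and API list are assumptions, and the service described in the abstract performs proper static analysis rather than a plain text scan.

```python
# Toy scan of decompiled Android sources for privacy-sensitive API calls.
# Illustrative only; not the static-analysis service from the paper.
import pathlib

SENSITIVE_CALLS = {
    "getDeviceId": "IMEI / device identifier",
    "getLine1Number": "phone number",
    "getLastKnownLocation": "location",
    "getAccounts": "account names",
}

def scan_sources(root):
    """Report occurrences of sensitive calls in decompiled .java/.smali files."""
    findings = []
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix not in (".java", ".smali"):
            continue
        text = path.read_text(errors="ignore")
        for call, meaning in SENSITIVE_CALLS.items():
            if call in text:
                findings.append((str(path), call, meaning))
    return findings

for path, call, meaning in scan_sources("decompiled_app/"):
    print(f"{path}: uses {call} ({meaning})")
```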

Relevance:

30.00%

Publisher:

Abstract:

Private data stored on smartphones is a precious target for malware attacks. A constantly changing environment, e.g. switching network connections, can cause unpredictable threats and requires an adaptive approach to access control. Context-based access control uses dynamic environmental information, incorporating it into access decisions. We propose an "ecosystem-in-an-ecosystem" which acts as a secure container for trusted software, aimed at enterprise scenarios where users are allowed to use private devices. We have implemented a proof-of-concept prototype of an access control framework that processes changes to low-level sensors and semantically enriches them, adapting access control policies to the current context. This allows the user or the administrator to maintain fine-grained control over resource usage by compliant applications. Hence, resources local to the trusted container remain under the control of the enterprise policy. Our results show that context-based access control can be done on smartphones without major performance impact.
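
The core mechanism, enriching low-level sensor readings into a semantic context and letting the policy decision depend on it, can be sketched in a few lines. The context labels, policy rules and resources below are invented for illustration and are not taken from the prototype.

```python
# Simplified sketch of context-based access control: raw sensor values are
# enriched into a semantic context, and access decisions depend on it.
def enrich_context(sensors):
    """Map raw sensor values to a semantic context label (rules are illustrative)."""
    if sensors.get("wifi_ssid") == "corp-office" and sensors.get("screen_locked") is False:
        return "trusted_enterprise"
    if sensors.get("network") == "cellular":
        return "mobile_untrusted"
    return "unknown"

POLICY = {
    # (context, resource) -> allowed?
    ("trusted_enterprise", "corporate_mail"): True,
    ("trusted_enterprise", "camera"): True,
    ("mobile_untrusted", "corporate_mail"): True,
    ("mobile_untrusted", "camera"): False,
}

def check_access(sensors, resource):
    context = enrich_context(sensors)
    return POLICY.get((context, resource), False)   # default deny

print(check_access({"network": "cellular"}, "camera"))                           # False
print(check_access({"wifi_ssid": "corp-office", "screen_locked": False}, "corporate_mail"))  # True
```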

Relevance:

30.00%

Publisher:

Abstract:

Smartphones are becoming increasingly popular as more and more smartphone platforms emerge. Special attention has been gained by the open-source platform Android, which was presented by the Open Handset Alliance (OHA), whose members include Google, Motorola, and HTC. Android uses a Linux kernel and a stripped-down userland with a custom Java VM set on top. The resulting system joins the advantages of both environments, although third parties are currently intended to develop only Java applications. In this work, we present the benefits of using native applications in Android. Android includes a fully functional Linux, and using it for heavy computational tasks when developing applications can bring a substantial performance increase. We present how to develop native applications and software components, as well as how to let Linux applications and components communicate with Java programs. Additionally, we present performance measurements of native and Java applications executing identical tasks. The results show that native C applications can be up to 30 times as fast as an identical algorithm running in the Dalvik VM, and Java applications can achieve a speed-up of up to 10 times by utilizing JNI.

Relevance:

30.00%

Publisher:

Abstract:

This report looks at opportunities in relation to what is either already available or starting to take off in Information and Communication Technology (ICT). ICT concerns the entire system of information, communication, processes and knowledge within an organisation, and how technology can be implemented to serve the information and communication needs of people and organisations. An ICT system involves a combination of work practices, information, people and a range of technologies and applications organised to make the business or organisation fully functional and efficient, and to accomplish its goals. Our focus is on vocational, work-based education in New Zealand. It is not about eLearning, although we briefly touch on the topic. We provide a background on vocational education in New Zealand, cover what we consider to be key trends impacting work-based vocational education and training (VET), and offer practical suggestions for leveraging better value from ICT initiatives across the main activities of an Industry Training Organisation (ITO). We use a learning value chain approach to demonstrate the main functions ITOs engage in, and also use this approach as the basis for developing and prioritising an ICT strategy. Much of what we consider in this report is applicable to the wider tertiary education sector as it relates to life-long learning. We consider ICT as an enabler that: a) connects education businesses (all types, including tertiary education institutions) to learners, their career decisions and their learning, and b) enables those same businesses to run more efficiently. We suggest that these two sets of activities be considered as interconnected parts of the same education or training business ICT strategy.

Relevance:

30.00%

Publisher:

Abstract:

Qualitative research methods are widely accepted in Information Systems, and multiple approaches have been successfully used in IS qualitative studies over the years. These approaches include narrative analysis, discourse analysis, grounded theory, case study, ethnography and phenomenological analysis. Guided by critical, interpretive and positivist epistemologies (Myers 1997), qualitative methods are continuously growing in importance in our research community. In this special issue, we adopt Van Maanen's (1979: 520) definition of qualitative research as an umbrella term covering an "array of interpretive techniques that can describe, decode, translate, and otherwise come to terms with the meaning, not the frequency, of certain more or less naturally occurring phenomena in the social world". In the call for papers, we stated that the aim of the special issue was to provide a forum within which we can present and debate the significant number of issues, results and questions arising from the pluralistic approach to qualitative research in Information Systems. We recognise both the potential and the challenges that qualitative approaches offer for accessing the different layers and dimensions of a complex and constructed social reality (Orlikowski, 1993). The special issue is also a response to the need to showcase the current state of the art in IS qualitative research and to highlight advances and issues encountered in the process of continuous learning, including questions about its ontology, epistemological tenets, theoretical contributions and practical applications.

Relevance:

30.00%

Publisher:

Abstract:

Cooperative Systems provide, through the multiplication of information sources along the road, significant potential to improve the safety of road users, especially drivers. However, developing cooperative ITS applications requires additional resources compared to non-cooperative applications, which is both time-consuming and expensive. In this paper, we present a simulation architecture aimed at prototyping cooperative ITS applications in an accurate, detailed, close-to-reality environment; the architecture is designed to be modular and generalist. It can be used to simulate any type of CS application as well as augmented perception. We then discuss the results of two applications deployed with our architecture, using a common freeway emergency braking scenario. The first application is Emergency Electronic Brake Light (EEBL); we discuss improvements in safety in terms of the number and severity of crashes. The second application compares the performance of a cooperative risk assessment using an augmented map against a non-cooperative approach based on local perception only. Our results show a systematic improvement in forward warning time for most vehicles in the string when using the augmented-map-based risk assessment.
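
The forward warning-time benefit can be illustrated with a toy calculation: with a broadcast EEBL message every equipped vehicle learns of the hard-braking event almost at once, whereas with local perception alone the event propagates vehicle by vehicle. The delays below are assumed round numbers, not values from the simulations reported above.

```python
# Toy comparison of forward warning time in a string of vehicles.
# V2V latency and perception delay are illustrative assumptions.
N_VEHICLES = 6
V2V_LATENCY = 0.1          # s, one broadcast reaches every equipped vehicle
PERCEPTION_DELAY = 0.8     # s, time to notice that the vehicle ahead is braking

def warning_times_local(n):
    """Without cooperation, the brake event is perceived vehicle by vehicle."""
    return [i * PERCEPTION_DELAY for i in range(1, n)]

def warning_times_cooperative(n):
    """With EEBL / an augmented map, the warning arrives almost immediately."""
    return [V2V_LATENCY for _ in range(1, n)]

for idx, (local, coop) in enumerate(zip(warning_times_local(N_VEHICLES),
                                        warning_times_cooperative(N_VEHICLES)), start=1):
    print(f"vehicle {idx}: warned after {local:.1f} s locally vs {coop:.1f} s cooperatively "
          f"(gain {local - coop:.1f} s)")
```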

Relevance:

30.00%

Publisher:

Abstract:

This project was a step forward in developing and evaluating a novel mathematical model that can deduce the meaning of words based on their use in language. This model can be applied to a wide range of natural language applications, including the information seeking process most of us undertake on a daily basis.
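
The abstract does not specify the model, so the following is only a generic illustration of the "meaning from use" idea: represent each word by the words it co-occurs with, and compare words by the cosine similarity of those co-occurrence vectors. The corpus and window size are toy assumptions.

```python
# Generic distributional-semantics sketch (not the project's actual model):
# word meaning approximated by co-occurrence counts, compared with cosine similarity.
from collections import Counter, defaultdict
import math

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "stocks fell on the market today",
]

def cooccurrence_vectors(sentences, window=2):
    vectors = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.split()
        for i, w in enumerate(words):
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if i != j:
                    vectors[w][words[j]] += 1
    return vectors

def cosine(a, b):
    shared = set(a) & set(b)
    num = sum(a[k] * b[k] for k in shared)
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

vecs = cooccurrence_vectors(corpus)
print("cat ~ dog:", round(cosine(vecs["cat"], vecs["dog"]), 3))       # similar contexts
print("cat ~ market:", round(cosine(vecs["cat"], vecs["market"]), 3)) # dissimilar contexts
```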

Relevance:

30.00%

Publisher:

Abstract:

Social media tools are often the result of innovations in Information Technology and are developed by IT professionals and innovators. Nevertheless, IT professionals, many of whom are responsible for designing and building social media technologies, have not been investigated with regard to how they themselves use or experience social media for professional purposes. This study will use Information Grounds Theory (Pettigrew, 1998) as a framework to study IT professionals' experience in using social media for professional purposes. Information grounds facilitate the opportunistic discovery of information within social settings created temporarily at a place where people gather for a specific purpose (e.g., doctors' waiting rooms, office tea rooms), where the social atmosphere stimulates spontaneous sharing of information (Pettigrew, 1999). This study proposes that social media has the qualities that make it a rich information ground: people participate from separate "places" in cyberspace in a synchronous manner in real time, making it almost as dynamic and unplanned as physical information grounds. There is limited research on how social media platforms are perceived as a "place" (a place to go to, a place to gather, or a place to be seen in) comparable to physical spaces. There is also no empirical study on how IT professionals use or "experience" social media. The data for this study are being collected through a study of IT professionals who currently use Twitter. A digital ethnography approach is being taken, wherein the researcher uses online observations, "follows" the participants online, and observes their behaviours and interactions on social media. Next, a sub-set of participants will be interviewed on their experiences with and within social media and on how social media compares with traditional information grounds, information communication, and collaborative environments. An Evolved Grounded Theory (Glaser, 1992) approach will be used to analyse the tweet data and interviews and to map the findings against Information Grounds Theory. Findings from this study will provide a foundational understanding of IT professionals' experiences within social media, and can help both professionals and researchers understand this fast-evolving method of communication.

Relevance:

30.00%

Publisher:

Abstract:

Modernized GPS and GLONASS, together with the new GNSS systems BeiDou and Galileo, offer code and phase ranging signals on three or more carriers. Traditionally, dual-frequency code and/or phase GPS measurements are linearly combined to eliminate the effects of ionospheric delays in various positioning and analysis tasks. This typical treatment has limitations in processing signals at three or more frequencies from more than one system, and can hardly be adapted to cope with the boom in receivers offering a broad variety of signals. In this contribution, a generalized positioning model that is independent of navigation system and carrier number is promoted, which is suitable for both single- and multi-site data processing. For the synchronization of different signals, uncalibrated signal delays (USD) are defined more generally to compensate for the signal-specific offsets in code and phase observations, respectively. In addition, the ionospheric delays are included in the parameterization with careful consideration. Based on an analysis of the algebraic structures, this generalized positioning model is further refined with a set of proper constraints to regularize the datum deficiency of the observation equation system. With this new model, uncalibrated signal delays (USD) and ionospheric delays are derived for both GPS and BeiDou with a large data set. Numerical results demonstrate that, with a limited number of stations, the uncalibrated code delays (UCD) are determined to a precision of about 0.1 ns for GPS and 0.4 ns for BeiDou signals, while the uncalibrated phase delays (UPD) for L1 and L2 are generated with 37 stations evenly distributed in China for GPS, with a consistency of about 0.3 cycles. Additional experiments concerning the performance of this novel model in point positioning with mixed frequencies from mixed constellations are analysed, in which the USD parameters are fixed to our generated values. The results are evaluated in terms of both positioning accuracy and convergence time.
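
For context, one common way to write the undifferenced multi-frequency code and phase observation equations that such a model parameterizes is sketched below. The notation is generic and assumed, not taken from the paper, and the paper's exact parameterization and constraints may differ.

```latex
% Generic multi-GNSS, multi-frequency observation equations
% (s: satellite, r: receiver, j: frequency/signal index)
\begin{align}
P_{r,j}^{s}    &= \rho_{r}^{s} + c\,(\mathrm{d}t_{r} - \mathrm{d}t^{s}) + T_{r}^{s}
                  + \mu_{j} I_{r}^{s} + d_{r,j} - d_{j}^{s} + \varepsilon_{P}, \\
\Phi_{r,j}^{s} &= \rho_{r}^{s} + c\,(\mathrm{d}t_{r} - \mathrm{d}t^{s}) + T_{r}^{s}
                  - \mu_{j} I_{r}^{s} + \lambda_{j} N_{r,j}^{s}
                  + b_{r,j} - b_{j}^{s} + \varepsilon_{\Phi},
\qquad \mu_{j} = f_{1}^{2}/f_{j}^{2}
\end{align}
% d_{r,j}, d_j^s : receiver/satellite uncalibrated code delays (UCD)
% b_{r,j}, b_j^s : receiver/satellite uncalibrated phase delays (UPD)
% I_r^s          : slant ionospheric delay on the first frequency
```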