958 results for Electronic data processing personnel - Certification


Relevance: 100.00%

Abstract:

Using Wireless Sensor Networks (WSNs) in healthcare systems has attracted considerable attention in recent years. In much of this research, tasks such as sensor data processing, health-state decision making and emergency message sending are performed by a remote server. When many patients each produce large volumes of sensor data, they consume a great deal of communication resources, place a burden on the remote server and delay decision and notification times. This paper simulates a WSN-based healthcare application for elderly people. A WSN designed for the proposed healthcare application needs efficient Medium Access Control (MAC) and routing protocols to guarantee the reliability of the data delivered from the patients to the medical centre. Based on these requirements, the GinMAC protocol, extended with a mobility module, was chosen to provide the required performance in terms of reliable data delivery and energy saving. Simulation results show that this modification to GinMAC offers the required performance for the proposed healthcare application.

Relevance: 100.00%

Abstract:

This special issue is focused on the assessment of algorithms for the observation of Earth’s climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth’s climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are not influenced by observational artifacts, but are robust.
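The data-quality aspects named above (accuracy, precision, stability) can be made concrete with a small sketch. This is an illustrative example only, not drawn from the special issue: it compares a hypothetical satellite-retrieved series against co-located validation data, taking accuracy as the mean bias, precision as the scatter of the differences, and stability as the drift of the bias over time.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(120)  # months of hypothetical co-located matchups

# Synthetic "truth" from validation data, and a retrieval with a
# constant bias, random noise, and a slow calibration drift.
truth = 15.0 + np.sin(2 * np.pi * t / 12)
retrieved = truth + 0.3 + 0.1 * rng.standard_normal(t.size) + 0.001 * t

diff = retrieved - truth
accuracy = diff.mean()                       # systematic bias
precision = diff.std(ddof=1)                 # scatter about the bias
stability = np.polyfit(t, diff, 1)[0] * 120  # bias drift per decade

print(f"bias={accuracy:.2f}  scatter={precision:.2f}  drift/decade={stability:.2f}")
```

In a real assessment the same metrics would be computed per algorithm and per observational context, against validation data of known independence.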

Relevance: 100.00%

Abstract:

Objectives. To study mortality trends related to Chagas disease, taking into account all mentions of this cause listed on any line or part of the death certificate. Methods. Mortality data for 1985-2006 were obtained from the multiple cause-of-death database maintained by the São Paulo State Data Analysis System (SEADE). Chagas disease was classified as the underlying cause-of-death or as an associated (non-underlying) cause-of-death. The total number of times Chagas disease was mentioned on the death certificates was also considered. Results. During this 22-year period, there were 40 002 deaths related to Chagas disease: 34 917 (87.29%) with Chagas disease as the underlying cause-of-death and 5 085 (12.71%) as an associated cause-of-death. The results show a 56.07% decline in the death rate due to Chagas disease as the underlying cause and a stable rate as an associated cause. The number of deaths was 44.5% higher among men. The fact that 83.5% of the deaths occurred after 45 years of age reflects a cohort effect. The main causes associated with Chagas disease as the underlying cause-of-death were direct complications of cardiac involvement, such as conduction disorders, arrhythmias and heart failure. Ischemic heart disease, cerebrovascular disorders and neoplasms were the main underlying causes when Chagas disease was an associated cause-of-death. Conclusions. Considering all mentions of Chagas disease, a 51.34% decline in the death rate was observed, whereas the decline in the number of deaths was only 5.91%; the decline was smaller among women, and deaths shifted to older age brackets. Using the multiple cause-of-death method contributed to the understanding of the natural history of Chagas disease.
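The multiple cause-of-death tallying described above can be sketched in a few lines. The records below are hypothetical, not SEADE data: each certificate carries one underlying cause and any number of associated causes as ICD-10 codes (B57 is the ICD-10 block for Chagas disease), and mentions are counted in both roles.

```python
# Hypothetical death-certificate records, each with one underlying cause
# and any associated causes as ICD-10 codes. B57 = Chagas disease.
CHAGAS = "B57"

certificates = [
    {"underlying": "B57.2", "associated": ["I44.2", "I50.0"]},  # Chagas underlying
    {"underlying": "I21.9", "associated": ["B57.2"]},           # Chagas associated
    {"underlying": "C34.9", "associated": ["J18.9"]},           # no mention
]

# Count certificates with Chagas as the underlying cause.
underlying = sum(c["underlying"].startswith(CHAGAS) for c in certificates)

# Count certificates mentioning Chagas only as an associated cause.
associated = sum(
    not c["underlying"].startswith(CHAGAS)
    and any(code.startswith(CHAGAS) for code in c["associated"])
    for c in certificates
)

mentions = underlying + associated  # total mentions, as in the abstract
print(underlying, associated, mentions)  # 1 1 2
```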

Relevance: 100.00%

Abstract:

OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson’s disease (PD) patients, using digital spiral data gathered by a touch-screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects, including 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper-limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis; the score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis. The first method was based on the Discrete Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment, henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity, henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed as the mean of the three possible correlations (Spearman’s rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores differed between HE subjects and the three patient groups (P=0.626 for the S group with a 9.9% mean value difference, P=0.089 for the I group with 30.2%, and P=0.0019 for the A group with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was not correlated with either WAV or SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83) and SDDV (0.55). CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at the advanced level. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. The lack of correlation between APEN and the other two methods indicates that it measures a different construct of upper-limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.

Relevance: 100.00%

Abstract:

Exploration with formal design systems comprises an iterative process of specifying problems, finding plausible and alternative solutions, judging the validity of solutions relative to problems, and reformulating problems and solutions. Recent advances in formal generative design have developed the mathematics and algorithms to describe and perform conceptual design tasks. However, design remains a human enterprise: formalisms are part of a larger equation comprising human-computer interaction. To support the user in designing with formal systems, shared representations that interleave the initiative of the designer and that of the design formalism are necessary. This paper reports on the problem of devising representational structures in which initiative is sometimes taken by the designer and sometimes by the computer while working on a shared design task. To address this problem, the requirements, representation and implementation of a shared interaction construct, the feature node, are described. The feature node facilitates the sharing of initiative in formulating and reformulating problems, generating solutions, making choices and navigating the history of exploration.

Relevance: 100.00%

Abstract:

This chapter reports the results of a feasibility study into electronic collection of service data at “point of delivery” for disability programs. The investigation revealed that while the proposed system would have produced more fine-grained data, it would not have improved any actor’s knowledge of service delivery. The study illustrated the importance of context in the transition from data to knowledge; the diffused and fragmented organisational structure of social service administration was shown to be a major barrier to effective building and sharing of knowledge. There was some value in the collection of detailed service data but this would have damaged the web of relationships which underpinned the system of service delivery and on which the smooth functioning of that system depended. The study recommended an approach to managing the informal and tacit knowledge distributed among many stakeholders, which was not especially technologically advanced but which supported, in a highly situated manner, the various stakeholders in this multi-organisational context.

Relevance: 100.00%

Abstract:

The adoption of electronic commerce strategies is becoming an important means of helping industries, and indeed whole economies, to gain significant net benefits. The extent to which e-commerce-based strategies, such as quick response and efficient consumer response, might affect local economies depends in part on how readily they are adopted. The dominant form of adoption of these strategies is found in business-to-business e-commerce. To be successful, business partners must be in a position to develop customer intimacy through information sharing, to improve their stock replenishment practices, and to enhance their levels of online customer support. This paper presents the initial results of a national survey of the retail sector of the Australian economy that assesses how well Australian industry is responding to these e-commerce challenges.


Relevance: 100.00%

Abstract:

The provenance of entities, whether electronic data or physical artefacts, is crucial information in practically all domains, including science, business and art. The increased use of software in automating activities provides the opportunity to add greatly to the amount we can know about an entity's history and the process by which it came to be as it is. However, it also presents difficulties: querying for the provenance of an entity could potentially return detailed information stretching back to the beginning of time, and most of it possibly irrelevant to the querier. In this paper, we define the concept of provenance query and describe techniques that allow us to perform scoped provenance queries.
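One simple way to scope a provenance query, sketched below, is to bound the depth of the backward traversal over a derivation graph. The graph, entity names and depth-based scope are illustrative assumptions only; the paper's own query techniques are not reproduced here.

```python
from collections import deque

# Hypothetical provenance graph: entity -> the entities it was derived from.
derived_from = {
    "report.pdf": ["analysis.csv", "template.tex"],
    "analysis.csv": ["raw_data.csv", "clean.py"],
    "raw_data.csv": ["sensor_feed"],
}

def scoped_provenance(entity, max_depth):
    """Provenance of `entity`, truncated at `max_depth` derivation steps back."""
    seen, frontier = set(), deque([(entity, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:  # scope boundary: stop expanding here
            continue
        for parent in derived_from.get(node, []):
            if parent not in seen:
                seen.add(parent)
                frontier.append((parent, depth + 1))
    return seen

# A scope of 1 returns only the immediate sources, not the full history.
print(sorted(scoped_provenance("report.pdf", 1)))  # ['analysis.csv', 'template.tex']
```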

Relevance: 100.00%

Abstract:

Recovering the control or implicit geometry underlying temple architecture requires bringing together fragments of evidence from field measurements, relating these to mathematical and geometric descriptions in canonical texts and proposing "best-fit" constructive models. While scholars in the field have traditionally used manual methods, the innovative application of niche computational techniques can help extend the study of artefact geometry. This paper demonstrates the application of a hybrid computational approach to the problem of recovering the surface geometry of early temple superstructures. The approach combines field measurements of temples, close-range architectural photogrammetry, rule-based generation and parametric modelling. The computing of surface geometry comprises a rule-based global model governing the overall form of the superstructure, several local models for individual motifs using photogrammetry and an intermediate geometry model that combines the two. To explain the technique and the different models, the paper examines an illustrative example of surface geometry reconstruction based on studies undertaken on a tenth-century stone superstructure from western India. The example demonstrates that a combination of computational methods yields sophisticated models of the constructive geometry underlying temple form and that these digital artefacts can form the basis for in-depth comparative analysis of temples arising out of similar techniques, spread across geography, culture and time.

Relevance: 100.00%

Abstract:

Over the last few years, perceptions of the importance of eHealth have increased rapidly, together with the use of IS&T in the delivery of health and social services. Although “e” approaches to health and social services have much potential, they are not panaceas, and the use of new technologies in improving the efficiency and effectiveness of such systems cannot be considered in isolation from their wider context. eHealth systems remain complex socio-organisational systems and, as we will argue and illustrate through this case study, require that a balanced approach to feasibility and desirability analysis be taken.

The case study in this paper describes a feasibility study into the potential effectiveness of a smart-device-based electronic data collection and payment system which was proposed for the provision of disability services. A key finding of the study was that the most significant impediment to such a system was the highly diffused, fragmented, interlocking organisational structure of the social service administration itself. Rather than raise issues specific to the implementation or diffusion of new technologies in designing e-health services, it raised issues associated with decision making and control in such an environment, and with the design of the underlying organisational system: for service provision, the level of detail required in the service data, and the locus of decision-making power among the stakeholders.

In our account we illustrate the existence of multiple, incommensurate but valid perceptions of the human service provision problem, and discuss the implications for developers or managers of information systems in the arena of e-health or governance. We examine this environment from sociological and information systems perspectives, and confirm the usefulness of socio-organisational approaches in understanding such contexts.

Relevance: 100.00%

Abstract:

The rapid development of network technologies has made the web a huge information source with its own characteristics. In most cases, traditional database-based technologies are no longer suitable for web information processing and management. To process and manage web information effectively, it is necessary to reveal the intrinsic relationships and structures among the web information objects concerned, such as web pages. In this work, a set of web pages with such intrinsic relationships is called a web page community. This paper proposes a matrix-based model to describe relationships among the web pages concerned. Based on this model, intrinsic relationships among pages can be revealed and, in turn, a web page community can be constructed. Issues related to the application of the model are investigated in depth. The concepts of community and intrinsic relationships, as well as the proposed matrix-based model, are then extended to other application areas such as biological data processing. Application cases of the model in a broad range of areas are presented, demonstrating the potential of this matrix-based model.
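As a sketch of the general idea (the paper's actual matrix model is not reproduced here): relationships among pages can be encoded in an adjacency matrix, and one simple notion of a page community then falls out as the connected components of the symmetrised relationship matrix, computed below via a Boolean transitive closure. The pages and links are hypothetical.

```python
import numpy as np

# Hypothetical pages with a directed "links-to" relation.
pages = ["A", "B", "C", "D", "E"]
links = [("A", "B"), ("B", "A"), ("B", "C"), ("D", "E")]

n = len(pages)
idx = {p: i for i, p in enumerate(pages)}
M = np.zeros((n, n), dtype=int)
for src, dst in links:
    M[idx[src], idx[dst]] = 1

# Symmetrise: two pages are related if either links to the other.
R = ((M + M.T) > 0).astype(int)

# Transitive closure by repeated Boolean squaring of (R + I);
# afterwards C[i, j] = 1 iff pages i and j are in the same component.
C = R + np.eye(n, dtype=int)
for _ in range(int(np.ceil(np.log2(n))) + 1):
    C = ((C @ C) > 0).astype(int)

# Pages with identical closure rows belong to the same community.
communities = {}
for i, p in enumerate(pages):
    communities.setdefault(tuple(C[i]), []).append(p)
print(list(communities.values()))  # [['A', 'B', 'C'], ['D', 'E']]
```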

Relevance: 100.00%

Abstract:

As a teaching practice, the application of cooperative learning in tertiary education can present unique challenges for both the practitioner and her students. Mastering this teaching approach requires careful planning, design and implementation for effective deployment in a face-to-face setting, where the success of the cooperative learning approach has been demonstrated. The complexity is significantly increased by additional variables, such as the selection and application of technological teaching tools, and by changes in the nature of existing variables, including awareness of students' social and communication skills, when applying this practice in an Online Learning Environment (OLE). In addition, student acceptance of this e-learning approach needs to be carefully considered. The practitioner must be aware of these factors and have suitable methods in place to support collaboration and interaction between student groups, to ensure that the ultimate goal with regard to students' learning is achieved. This paper considers how cooperative learning can be combined effectively with these variables and factors of an OLE, beginning with a conceptual framework that represents this relationship as a constructive teaching practice. It is anticipated that the conceptual framework could be applied by other practitioners to facilitate cooperative teaching within their own OLEs. To demonstrate the validity of the framework, a case scenario is provided using an Information Technology (IT) undergraduate unit named 'IT Practice', a wholly online unit in which extensive participation by students within small groups is a core requirement. The paper touches on the themes of designing curriculum and assessment and of flexible teaching strategies for learner diversity, but concentrates primarily on managing an effective OLE, that is, managing small groups in an online teaching environment.

Relevance: 100.00%

Abstract:

Established supply chain management techniques such as Quick Response (QR) or Customer Relationship Management (CRM) have proven the potential benefits of reorganizing an organization’s processes to take advantage of the characteristics of electronic information exchange. As the Internet and other proprietary networks expand, however, organizations have the opportunity to use this enabling infrastructure to exchange other, more varied types of information than traditional electronic data interchange (EDI) messages. This is especially true of companies with global operations and interests, which lead to a more diverse set of trading activities. This case presents the experiences of a large Australian paper products manufacturer in implementing an electronic document exchange strategy for supply chain management, including the drivers for change which spurred their actions, and describes the issues associated with trying to support existing and future requirements for document exchange across a wide variety of trading partners. The experiences of PaperCo will be relevant to organizations with diverse trading partners, especially small to medium enterprises (SMEs).