875 results for Data and Information Technology
Abstract:
Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preferences and system technical conditions. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability, and security. This dissertation contributed to the development of a novel smart grid test-bed laboratory with integrated monitoring, protection, and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing, and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes, abnormal conditions, and stability and security information. These developments provide valuable measurements for power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation, and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide-area applications. Experiments and procedures were developed to detect abnormal system conditions and apply proper remedies to restore the system. A DC microgrid was designed and integrated with the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, enabling study of how such an architecture can help remedy abnormal system conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features to monitor system security and stability measures. These indices are measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy-load impacts on system stability and operational security.
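The abstract does not spell out the phasor estimation algorithm; a common choice for DAQ-based monitoring is a one-cycle DFT estimate at nominal frequency. A minimal sketch in Python, where the sampling rate, sample count, and test signal are illustrative assumptions rather than values from the dissertation:

    import numpy as np

    def dft_phasor(samples):
        """Estimate the phasor of exactly one fundamental-frequency cycle.

        Returns the RMS phasor: magnitude is the RMS value, angle is the
        phase relative to the start of the window. Assumes the window
        spans one nominal cycle (a simplification; real PMUs also track
        off-nominal frequency and align to a GPS time reference).
        """
        n = len(samples)
        k = np.arange(n)
        # Fundamental-frequency DFT bin, scaled so |X| is the RMS value.
        return (np.sqrt(2) / n) * np.sum(samples * np.exp(-2j * np.pi * k / n))

    # Hypothetical test: 64 samples/cycle of a 60 Hz cosine, 170 V peak, 30 deg.
    fs, f0, n = 3840, 60, 64
    t = np.arange(n) / fs
    v = 170 * np.cos(2 * np.pi * f0 * t + np.deg2rad(30))
    ph = dft_phasor(v)
    print(abs(ph), np.rad2deg(np.angle(ph)))  # ~120.2 V RMS, ~30 deg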
Abstract:
Hospitality organizations are embracing technology in all its aspects to ensure that they can compete effectively in today's market. The author cites the results of a survey of corporate executives designed to assess how technology is affecting their organizations.
Abstract:
In response to the recent wide-scale applications of Information Technology (IT) in the hospitality industry, this study analyzed articles in leading hospitality research journals, including the International Journal of Hospitality Management, Cornell Hotel and Restaurant Administration Quarterly, and the Journal of Hospitality & Tourism Research, published in the period 1985 to 2004. A total of 1,896 full-length papers were published in these journals during the study period. Excluding book reviews, research notes, and comments from editors and readers, 130 full-length IT-related papers were identified. These papers were then grouped into six defined categories of IT. The findings revealed that over the entire study period, the largest number of publications was in general business applications, whereas the highest growth rate from the first decade to the second was in articles on networking.
Abstract:
The successful introduction of information technology applications into the various operations of hotel management is vital to most service firms. In recent decades, information, automation, and communication technologies have been increasingly recognized as essential components of a hotel company's strategic plan. In this study, 62 super-deluxe (5-star), deluxe (4-star), and tourist (3-star) hotels in Korea were examined for differences in the impact of information technology services on guest satisfaction, guest convenience, and operational efficiency. The findings generally suggest that the impacts of information-technology-enhanced services vary according to the category of hotel in Korea. The results of the study are expected to assist managers in the selection and implementation of information technology systems in their hotels.
Abstract:
The age of organic material discharged by rivers provides information about its sources and about carbon cycling processes within watersheds. While elevated ages in fluvially transported organic matter are usually explained by erosion of soils and sediments, it is commonly assumed that mainly young organic material is discharged from flat tropical watersheds because of their extensive plant cover and high carbon turnover. Here we present compound-specific radiocarbon data of terrigenous organic fractions from a sedimentary archive offshore of the Congo River, in conjunction with molecular markers for methane-producing land cover that reflect wetland extent in the watershed. We find that the Congo River has been discharging aged organic matter for several thousand years, with increasing ages from the mid- to the late Holocene. This suggests that aged organic matter in modern samples is concealed by radiocarbon from nuclear weapons testing. By comparison with indicators for past rainfall changes, we detect a systematic control of organic matter sequestration and release by continental hydrology, mediating temporary carbon storage in wetlands. As aridification also leads to exposure and rapid remineralization of large amounts of previously stored labile organic matter, we infer that this process may cause a profound direct climate feedback currently underestimated in carbon cycle assessments.
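For context on how bomb-derived radiocarbon can mask old carbon, conventional radiocarbon ages follow from the measured fraction modern via the Libby mean life (Stuiver & Polach, 1977). A small illustrative sketch with hypothetical values, not the paper's data:

    import math

    def radiocarbon_age(f14c):
        """Conventional radiocarbon age (yr BP) from fraction modern F14C,
        using the Libby mean life of 8033 yr."""
        return -8033.0 * math.log(f14c)

    # Bomb-derived 14C pushes F14C above 1, giving a negative apparent
    # "age" that can conceal genuinely old carbon mixed into the sample.
    for f in (0.85, 1.00, 1.10):
        print(f, round(radiocarbon_age(f)))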
Abstract:
The Lena River Delta, situated in Northern Siberia (72.0-73.8° N, 122.0-129.5° E), is the largest Arctic delta and covers 29,000 km². Since natural deltas are characterised by complex geomorphological patterns and various types of ecosystems, high-spatial-resolution information on the distribution and extent of the delta environments is necessary for a spatial assessment and accurate quantification of biogeochemical processes as drivers of greenhouse gas emission from tundra soils. In this study, the first land cover classification for the entire Lena Delta based on Landsat 7 Enhanced Thematic Mapper (ETM+) images was conducted and used to quantify methane emissions from the delta ecosystems at the regional scale. The applied supervised minimum distance classification was very effective with the few ancillary data that were available for training-site selection. Nine land cover classes of aquatic and terrestrial ecosystems in the wetland-dominated (72%) Lena Delta could be defined by this classification approach. The mean daily methane emission of the entire Lena Delta was calculated to be 10.35 mg CH4/m²/d. Taking our multi-scale approach into account, we find that the methane source strength of certain tundra wetland types is lower than previously calculated at coarser scales.
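The abstract names a supervised minimum distance classification without detailing it; the idea is to assign every pixel to the land cover class whose training-site mean spectrum is nearest in Euclidean distance. A minimal sketch, where the band values and class names are hypothetical (the study used Landsat 7 ETM+ bands and nine classes):

    import numpy as np

    def minimum_distance_classify(image, class_means):
        """Assign each pixel to the class with the nearest training mean.

        image:       (rows, cols, bands) array of spectral values.
        class_means: (n_classes, bands) mean spectrum per class, derived
                     from training sites.
        Returns a (rows, cols) array of class indices.
        """
        # Euclidean distance from every pixel to every class mean.
        diff = image[:, :, None, :] - class_means[None, None, :, :]
        dist = np.linalg.norm(diff, axis=-1)
        return np.argmin(dist, axis=-1)

    # Toy 2x2 image with 3 bands and two hypothetical classes.
    img = np.array([[[60, 55, 40], [20, 15, 8]],
                    [[58, 50, 38], [22, 18, 10]]], dtype=float)
    means = np.array([[60, 52, 39],   # "wet tundra"
                      [21, 16, 9]])   # "water"
    print(minimum_distance_classify(img, means))

The regional methane estimate then follows by weighting per-class emission rates by the areas of the classified classes.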
Abstract:
The purpose of this study was to test Lotka's law of scientific publication productivity, using the methodology outlined by Pao (1985), in the field of Library and Information Studies (LIS). Lotka's law has been tested sporadically in the field over the past 30+ years, but the results of these studies are inconclusive due to the varying methods employed by the researchers. A data set of 1,856 citations found using the ISI Web of Knowledge databases was studied. The values of n and c were calculated to be 2.1 and 0.6418 (64.18%), respectively. The Kolmogorov-Smirnov (K-S) one-sample goodness-of-fit test was conducted at the 0.10 level of significance. The Dmax value is 0.022758 and the calculated critical value is 0.026562. It was determined that the null hypothesis, stating that there is no difference between the observed distribution of publications and the distribution obtained using Lotka's and Pao's procedure, could not be rejected. This study finds that the literature in the field of Library and Information Studies does conform to Lotka's law with reliable results. As a result, Lotka's law can be used in LIS as a standardized means of measuring author publication productivity, which will lead to findings that are comparable on many levels (e.g., departmental, institutional, national). Lotka's law can be employed as an empirically proven analytical tool to establish publication productivity benchmarks for faculty and faculty librarians. Recommendations for further study include (a) exploring the characteristics of the high and low producers; (b) finding a way to successfully account for collaborative contributions in the formula; and (c) a detailed study of institutional policies concerning publication productivity and its impact on the appointment, tenure, and promotion process of academic librarians.
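For readers unfamiliar with the test: Lotka's law predicts that the proportion of authors with y publications is c / y^n, and Pao's procedure compares the observed and expected cumulative distributions with a one-sample K-S test. A minimal sketch with hypothetical author counts (the study's own data set is not reproduced here):

    import numpy as np

    def lotka_ks(observed_authors, n, c):
        """K-S statistic for observed author counts vs. Lotka's law.

        observed_authors[i] = number of authors with (i + 1) publications.
        Expected proportion of authors with y publications: c / y**n.
        Returns Dmax = max |observed CDF - expected CDF|.
        """
        y = np.arange(1, len(observed_authors) + 1)
        obs_cdf = np.cumsum(observed_authors) / observed_authors.sum()
        exp_cdf = np.cumsum(c / y**n)
        return float(np.max(np.abs(obs_cdf - exp_cdf)))

    # Hypothetical counts: 650 single-publication authors, 160 with two, ...
    counts = np.array([650, 160, 70, 40, 25, 17, 12, 9, 7, 5])
    # Dmax is compared against a critical value of the form K / sqrt(total
    # authors), where K depends on the chosen significance level.
    print(lotka_ks(counts, n=2.1, c=0.6418))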
Abstract:
Big Data Analytics is an emerging field, since massive storage and computing capabilities have been made available by advanced e-infrastructures. The Earth and environmental sciences are likely to benefit from Big Data Analytics techniques supporting the processing of the large number of Earth Observation datasets currently acquired and generated through observations and simulations. However, Earth science data and applications present specificities in terms of the relevance of geospatial information, the wide heterogeneity of data models and formats, and the complexity of processing. Therefore, Big Earth Data Analytics requires specifically tailored techniques and tools. The EarthServer Big Earth Data Analytics engine offers a solution for coverage-type datasets, built around a high-performance array database technology and the adoption and enhancement of standards for service interaction (OGC WCS and WCPS). The EarthServer solution, driven by requirements collected from scientific communities and international initiatives, provides a holistic approach that ranges from query languages and scalability up to mobile access and visualization. The result is demonstrated and validated through the development of lighthouse applications in the marine, geology, atmospheric, planetary, and cryospheric science domains.
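WCPS, one of the standards the engine builds on, is a declarative query language over coverages, commonly submitted through a WCS ProcessCoverages request. A minimal Python sketch; the endpoint URL and coverage name are placeholders, not EarthServer's actual services:

    import requests

    # Hypothetical rasdaman/petascope-style OGC endpoint.
    ENDPOINT = "https://example.org/rasdaman/ows"

    # WCPS: slice a temperature coverage at one point over a year, as CSV.
    query = """
    for c in (AvgLandTemp)
    return encode(
      c[Lat(53.08), Long(8.80), ansi("2014-01":"2014-12")],
      "csv")
    """

    resp = requests.get(ENDPOINT, params={
        "service": "WCS",
        "version": "2.0.1",
        "request": "ProcessCoverages",
        "query": query,
    })
    print(resp.text)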
Abstract:
Encryption of personal data is widely regarded as a privacy-preserving technology that could play a key role in enabling innovative IT to comply with the European data protection law framework. In this paper, we therefore examine the new EU General Data Protection Regulation's provisions relevant to encryption, such as those on anonymisation and pseudonymisation, and assess whether encryption can serve as an anonymisation technique, which could render the GDPR inapplicable. However, the GDPR's provisions on the material scope of the Regulation still leave room for legal uncertainty when determining whether a data subject is identifiable. We therefore assess, inter alia, the Opinion of the Advocate General of the European Court of Justice (ECJ) on a preliminary ruling concerning whether a dynamic IP address can be considered personal data, which may put an end to the dispute over whether an absolute or a relative approach should be used to assess the identifiability of data subjects. Furthermore, we outline the question of whether the anonymisation process itself constitutes a further processing of personal data that needs a legal basis in the GDPR. Finally, we give an overview of relevant encryption techniques and examine their impact on the GDPR's material scope.
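As a concrete point of reference for the techniques discussed, pseudonymisation is often implemented as a keyed hash over a direct identifier. A minimal sketch (key and identifier are illustrative); whether the output still counts as personal data is exactly the legal question the paper examines:

    import hashlib
    import hmac

    def pseudonymise(identifier, key):
        """Replace a direct identifier with a keyed-hash pseudonym.

        With the key held separately, the mapping can be reproduced only
        by the key holder. Under the GDPR's definitions this is
        pseudonymisation rather than anonymisation, since
        re-identification remains possible for whoever holds the key.
        """
        return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

    key = b"secret-key-held-by-the-controller"  # illustrative key
    print(pseudonymise("alice@example.com", key))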
Abstract:
BACKGROUND: Invasive meningococcal disease is a significant cause of mortality and morbidity in the UK. Administration of chemoprophylaxis to close contacts reduces the risk of a secondary case. However, unnecessary chemoprophylaxis may be associated with adverse reactions, increased antibiotic resistance, and removal of organisms, such as Neisseria lactamica, that help to protect against meningococcal disease. Limited evidence exists to suggest that overuse of chemoprophylaxis may occur. This study aimed to evaluate the prescribing of chemoprophylaxis for contacts of meningococcal disease by general practitioners and hospital staff. METHODS: A retrospective case-note review of cases of meningococcal disease was conducted in one health district from 1st September 1997 to 31st August 1999. Routine hospital and general practitioner prescribing data were searched for chemoprophylactic prescriptions of rifampicin and ciprofloxacin. A questionnaire of general practitioners was undertaken to obtain more detailed information. RESULTS: Prescribing by hospital doctors was in line with recommendations by the Consultant for Communicable Disease Control. General practitioners prescribed 118% more chemoprophylaxis than was recommended. Size of practice and training status did not affect the level of additional prescribing, but there were significant differences by geographical area. The highest levels of prescribing occurred in areas with high disease rates and associated publicity. However, some true close contacts did not appear to receive prophylaxis. CONCLUSIONS: Receipt of chemoprophylaxis is affected by a series of patient, doctor, and community interactions. High publicity appears to increase demand for prophylaxis. Some true contacts do not receive appropriate chemoprophylaxis and are left at an unnecessarily increased risk.
Abstract:
The key to graduating professionals with sufficient capacity to meet the research demands of users lies in the vision and commitment of the schools and their academic libraries. All these efforts must be linked systematically to ensure the use of the data recorded in their knowledge and information units.
Abstract:
Operational approaches have been developed and used ever more widely to provide marine data and information services for different socio-economic sectors of Blue Growth and to advance knowledge about the marine environment. The objective of operational oceanographic research is to develop and improve the efficiency, timeliness, robustness, and product quality of this approach. This white paper aims to address key scientific challenges and research priorities for the development of operational oceanography in Europe over the next 5-10 years. Knowledge gaps and deficiencies are identified in relation to common scientific challenges in four EuroGOOS knowledge areas: European Ocean Observations, Modelling and Forecasting Technology, Coastal Operational Oceanography, and Operational Ecology. The areas "European Ocean Observations" and "Modelling and Forecasting Technology" focus on the further advancement of the basic instruments and capacities for European operational oceanography, while "Coastal Operational Oceanography" and "Operational Ecology" aim at developing new operational approaches for the corresponding knowledge areas.