799 results for Management - Data processing - Study and teaching (Higher) - Victoria


Relevance:

100.00%

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise several properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated. Quantitative methods were developed both to support the characterisation of an OCT system collecting images from phantoms and to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion.

Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing may take from several minutes to a few hours, making data processing a significant bottleneck. An alternative is expensive hardware-based processing such as field programmable gate arrays (FPGAs). More recently, however, graphics processing unit (GPU) based methods have been developed to minimise data processing and rendering time. These include standard processing methods, a set of algorithms that process the raw interference data recorded by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been used extensively for the qualitative characterisation and fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms.

The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, an extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the fabrication of phase masks, waveguides and microfluidic channels. The acceleration of data processing with GPUs is also useful in other fields.
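The thesis's own pipeline is not reproduced in this abstract; purely as an illustration, the sketch below shows the standard FD-OCT steps it names for turning raw interference spectra into A-scans (background subtraction, apodisation, Fourier transform, log scaling). The function name and array shapes are assumptions, k-linearisation is omitted for brevity, and the same NumPy code can in principle be moved to a GPU by substituting a compatible library such as CuPy.

```python
import numpy as np

def spectra_to_ascans(raw_spectra: np.ndarray) -> np.ndarray:
    """raw_spectra: (n_ascans, n_pixels) detector frames -> log-magnitude A-scans.

    Minimal sketch of standard FD-OCT processing; a real pipeline would also
    resample from wavelength to wavenumber (k-linearisation) before the FFT.
    """
    # Remove the fixed-pattern background (mean spectrum over the frame).
    spectra = raw_spectra - raw_spectra.mean(axis=0, keepdims=True)
    # Apodise each spectrum with a Hann window to suppress FFT side lobes.
    spectra = spectra * np.hanning(spectra.shape[1])
    # The Fourier transform along the spectral axis yields depth profiles;
    # the detector data are real, so keep only the positive-depth half.
    depth_profiles = np.fft.fft(spectra, axis=1)[:, : spectra.shape[1] // 2]
    # Log-scale the magnitude for display.
    return 20.0 * np.log10(np.abs(depth_profiles) + 1e-12)
```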

Relevance:

100.00%

Abstract:

This paper deals with communication breakdowns and misunderstandings in computer-mediated communication (CMC) and with ways to recover from them or prevent them. The paper describes a case study of CMC conducted in a company named Artigiani. We observed communication and conducted content analysis of e-mail messages, focusing on exchanges between customer service representatives (CSRs) and their contacts. In addition to task management difficulties, we identified communication breakdowns that result from differences between perspectives and from a lack of contextual information, mainly technical background and professional jargon, on the customers' side. We examined possible ways to enhance CMC and accordingly designed a prototype for an e-mail user interface that emphasizes a communicational strategy called contextualization as a central component for achieving effective communication and for supporting effective management and control of organizational activities, especially handling orders, quoting prices, and monitoring the supply and installation of products.

Relevance:

100.00%

Abstract:

Introduction: The research and teaching of French linguistics in UK higher education (HE) institutions have a venerable history; a number of universities have traditionally offered philology or history-of-the-language courses, which complement literary study. A deeper understanding of the way the phonology, syntax and semantics of the French language have evolved gives students linguistic insights that dovetail with their study of the Roman de Renart, Rabelais, Racine or the nouveau roman. There was, in the past, some coverage of contemporary French phonetics but little on sociolinguistic issues. More recently, new areas of research and teaching have been developed, with a particular focus on contemporary spoken French and on sociolinguistics. Well supported by funding councils, UK researchers are also making an important contribution in other areas: phonetics and phonology, syntax, pragmatics and second-language acquisition. A fair proportion of French linguistics research occurs outside French sections, in psychology or applied linguistics departments. In addition, the UK plays a particular role in bringing together European and North American intellectual traditions and methodologies, and in promoting the internationalisation of French linguistics research through the strength of its subject associations and that of the Journal of French Language Studies. The following sections treat each of these areas in turn.

History of the French Language: There is a long and distinguished tradition in Britain of teaching and research on the history of the French language, particularly, but by no means exclusively, at the universities of Cambridge, Manchester and Oxford.

Relevance:

100.00%

Abstract:

The purpose of this study was to identify the competencies needed by a Recreational Foodservice manager. A three-round Delphi method was used; Delphi is a research method that uses iterative rounds to elicit the opinion of a panel of experts on a specific subject. A nominating committee of 22 industry leaders was consulted to establish a panel of 40 management experts, of whom 35 (87.5%) completed all three rounds of the study.

Round One identified 17 specific job functions of a Recreational Foodservice manager. The researcher prepared an instrument detailing 60 competencies derived from an analysis of the Round One results and distributed it as the Round Two instrument, asking the panel to rate the relative importance of each listed competency on a five-point Likert scale. The Round Two results were tabulated and analyzed to ascertain areas of consensus, and a Round Three instrument was prepared advising panelists of all areas of consensus and their dissenting opinions, if any, and requesting a revised opinion.

A final report listed the 60 competencies and the panel's opinion that eight were of highest priority, 29 of above-average priority, and 23 of average priority; no item received either of the two other available ratings, below-average priority and lowest priority. These findings suggest areas of curriculum development and industry management development needed to advance professionalism among Recreational Foodservice managers.
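As an illustration of the tabulation step described above, the sketch below buckets mean Likert ratings into the three priority tiers the panel reported. The cut-off values and function name are hypothetical; the study does not state its consensus rule.

```python
import numpy as np

def tier_competencies(ratings: np.ndarray, names: list[str]) -> dict[str, str]:
    """ratings: (n_panelists, n_competencies) matrix of 1-5 Likert scores."""
    tiers = {}
    for name, scores in zip(names, ratings.T):
        mean = scores.mean()
        # Hypothetical cut-offs; the study's actual rule is not given.
        if mean >= 4.5:
            tiers[name] = "highest priority"
        elif mean >= 3.5:
            tiers[name] = "above average priority"
        else:
            tiers[name] = "average priority"
    return tiers
```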

Relevance:

100.00%

Abstract:

As massive data sets become increasingly available, people face the problem of how to process and understand these data effectively. Traditional sequential computing models are giving way to parallel and distributed models such as MapReduce, owing both to the large size of the data sets and to their high dimensionality. This dissertation, in line with other research based on MapReduce, develops effective MapReduce techniques and applications that help solve large-scale problems. Three problems are tackled. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third deal with dimension reduction techniques for handling data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
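The dissertation's MapReduce implementations are not reproduced here; as a rough single-machine sketch of the pattern underlying its scaled-up matrix multiplications, the function below emits partial products in a map phase keyed by output cell and sums them in a reduce phase. The sparse-dict representation and function name are illustrative assumptions.

```python
from collections import defaultdict

def mapreduce_matmul(A: dict, B: dict) -> dict:
    """Multiply sparse matrices stored as {(row, col): value} dicts.

    Map phase: join A entries (i, j) with B entries (j, k) on the shared
    index j and emit partial products keyed by the output cell (i, k).
    Reduce phase: sum the partial products per key. In a real MapReduce
    job these phases run sharded across machines.
    """
    B_by_row = defaultdict(list)
    for (j, k), b in B.items():
        B_by_row[j].append((k, b))
    out = defaultdict(float)
    for (i, j), a in A.items():
        for k, b in B_by_row.get(j, ()):
            out[(i, k)] += a * b
    return dict(out)

A = {(0, 0): 1.0, (0, 1): 2.0}
B = {(0, 0): 3.0, (1, 0): 4.0}
print(mapreduce_matmul(A, B))   # {(0, 0): 11.0}
```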

Relevance:

100.00%

Abstract:

International travel has significant implications for the study of architecture. This study analyzed the ways in which undergraduate and graduate students benefited from the experience of international travel and study abroad. Drawing on the perspectives of 15 individuals who were, or had been, architecture students at the University of Miami and Florida International University, or who were alumni of the University of Florida and Syracuse University, the research explored how international travel and study abroad enhanced their awareness and understanding of architecture and how it complemented their architecture curricula. The study also addressed a more personal aspect of international travel, examining how the experience and exposure to foreign cultures positively influenced the personal and professional development of the participants.

Participants' individual and two-person semi-structured interviews about their study abroad experiences were electronically recorded and transcribed for analysis. A second interview was conducted with five of the participants to obtain feedback on the accuracy of the transcripts and the interpretation of the data. Sketch journals and design projects from five participants were also analyzed and used as data to better understand what these individuals learned and experienced during their study abroad.

Findings indicated that study abroad experiences helped broaden students' understanding of architecture and urban development and opened up possibilities for creative and professional expression. For many, this was the most important aspect of their education as architects because it heightened their interest in architecture. These individuals talked about the opportunity to experience contemporary and ancient buildings that they had learned about in history and design classes on their home campuses. In terms of personal and professional development, many participants remarked that they became more independent and self-reliant as a result of their study abroad experiences. They also displayed a sense of global awareness, were interested in the cultures of their host nations, and reported that the experience had a lasting influence on their professional development.

Relevance:

100.00%

Abstract:

Companies have long recognized the importance of training and developing their managers to prepare them for their short- and long-term careers, and formal management-development programs and other less formal means of management development abound in the hospitality industry. One may therefore ask whether the entry-level managers for whom these programs are designed perceive them to be effective. The present study explores management-development practices, procedures, and techniques, and their effects on job satisfaction and organizational commitment.

Relevance:

100.00%

Abstract:

Background: Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets whose insights depend on specific scientific questions, while using complex software systems for a preliminary analysis may itself be deemed too time-consuming, especially for unfamiliar data types and formats. This can lead to wasted analysis time and the discarding of potentially useful data.

Results: We present an exploration of the design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations with both low overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets because of their accessibility. We also find that users, particularly those less experienced with computers, are attracted by the familiarity of the Google Maps interface. Our five implementations introduce design elements that can benefit visualization developers.

Conclusions: We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets. We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines for those wanting to create such visualizations, and five concrete example visualizations.
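The paper's tooling for serving pre-rendered images as map tiles is not shown in the abstract. As a hedged sketch of that general approach, the function below slices one large pre-rendered visualization (e.g. a heatmap image) into the 256-pixel z/x/y tile pyramid that a Google Maps custom map type (such as ImageMapType in the JavaScript API) can request; the path layout and function name are assumptions, and Pillow is used for the image handling.

```python
import os
from PIL import Image  # Pillow

TILE = 256  # Google Maps tiles are 256 x 256 pixels

def build_tile_pyramid(image_path: str, out_dir: str, max_zoom: int) -> None:
    """Slice one pre-rendered visualization into a z/x/y tile pyramid.

    Writes {out_dir}/{z}/{x}/{y}.png files, a layout a Google Maps
    custom map layer can fetch tile by tile.
    """
    base = Image.open(image_path).convert("RGB")
    for z in range(max_zoom + 1):
        n = 2 ** z  # the map is n x n tiles at zoom level z
        level = base.resize((n * TILE, n * TILE))
        for x in range(n):
            os.makedirs(f"{out_dir}/{z}/{x}", exist_ok=True)
            for y in range(n):
                tile = level.crop((x * TILE, y * TILE,
                                   (x + 1) * TILE, (y + 1) * TILE))
                tile.save(f"{out_dir}/{z}/{x}/{y}.png")
```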

Relevance:

100.00%

Abstract:

With the advent of peer-to-peer networks and, more importantly, sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. In tele-health applications, for example, sensor-based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, such applications require the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly under dynamically varying conditions. Yet existing data stream cleaning and filtering schemes cannot capture the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged.

This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To simplify the validation process, the developed solution maps the requirements of the application onto a geometric space and identifies the sensor nodes of potential interest. The dissertation also models a wireless sensor network data reduction system, showing that separating the data adaptation and prediction processes increases data reduction rates.

The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that the dynamic conditions of the environment are better managed when validation is used for data cleaning, and that data reduction rates improve significantly when a fast-converging adaptation process is deployed. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation, and homeland security.
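The dissertation's adaptive schemes are not reproduced above; as a minimal, hedged sketch of prediction-based data reduction in general, the function below transmits a reading only when it deviates from the last transmitted value by more than a tolerance, so a sink that holds the last received value reconstructs the stream within that tolerance. The dissertation's adaptive predictor would replace this constant-hold model; the names and threshold are illustrative.

```python
def reduce_stream(readings, tol=0.5):
    """Dead-band data reduction: send a sample only when it drifts more
    than `tol` from the last transmitted value. The sink reconstructs the
    stream by holding the last received value, so the reconstruction error
    is bounded by `tol` while far fewer packets are sent.
    """
    transmitted = []          # (sample index, value) pairs actually sent
    last_sent = None
    for i, x in enumerate(readings):
        if last_sent is None or abs(x - last_sent) > tol:
            transmitted.append((i, x))
            last_sent = x
    return transmitted

# Example: a slowly drifting temperature stream reduces to three updates.
samples = [20.0, 20.1, 20.2, 20.9, 21.0, 21.1, 22.0]
print(reduce_stream(samples))   # [(0, 20.0), (3, 20.9), (6, 22.0)]
```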

Relevance:

100.00%

Abstract:

Abstract and faculty adviser information are not available for this thesis.

Relevance:

100.00%

Abstract:

Few valid and reliable placement procedures are available to assess the English language proficiency of adults who enroll in English for Speakers of Other Languages (ESOL) programs. Whereas placement materials exist for children and university ESOL students, the needs of students in adult community education programs have not been adequately addressed. Furthermore, the research suggests that a number of variables, such as native language, age, prior schooling, length of residence, and employment, are related to second language acquisition. Numerous studies contribute to our understanding of the relationship of these factors to the second language acquisition of Spanish-speaking students; there is, however, a void in the research investigating the factors affecting second language acquisition, and consequently appropriate placement, of Haitian Creole-speaking students.

This study compared a standardized instrument, the NYS Place Test, used alone and in combination with a writing sample in English, with the subjective judgement of a department coordinator for the initial placement of Haitian adult ESOL students in a community education program. The study also investigated whether consideration of student profile data improved the accuracy of the test, and whether a relationship existed between student profile data and those who withdrew from the program or did not enter a class after registering.

Analysis of the data by crosstabulation and chi-square revealed that the standardized NYS Place Test was at least as accurate as subjective department coordinator placement and that one procedure could be substituted for the other. Although the writing sample in English improved the accuracy of placement by the NYS test, the improvement was not significant. Of the profile variables, only length of residence was significantly related to accuracy of placement using the NYS Place Test: the number of incorrect placements was higher for students who had lived in the host country from twenty-five to one hundred ten months. A post hoc analysis of NYS test scores by level showed that learners who placed in level three also had a significantly higher incidence of incorrect placements. No significant relationship was observed between the profile variables and those who withdrew from the program or registered but did not enter a class.
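As a sketch of the crosstabulation and chi-square analysis described above (with made-up counts, not the study's data), the snippet below tests whether placements produced by the NYS Place Test are associated with the coordinator's placements across three levels.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 3x3 crosstab: rows are NYS Place Test levels, columns are
# the coordinator's placement levels. These counts are illustrative only.
observed = np.array([
    [18,  4,  1],   # NYS level 1
    [ 3, 22,  5],   # NYS level 2
    [ 2,  6, 14],   # NYS level 3
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p indicates the two procedures' placements are associated rather
# than independent, i.e. they tend to place the same students at the same
# level.
```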

Relevance:

100.00%

Abstract:

Funding Sources: The NNUH Stroke and TIA Register is maintained by the NNUH NHS Foundation Trust Stroke Services, and data management for this study is supported by the NNUH Research and Development Department through Research Capability Funds.

Relevance:

100.00%

Abstract:

Much has been written about Big Data from technical, economic, juridical and ethical perspectives. Still, very little empirical and comparative data is available on how Big Data is approached and regulated in Europe and beyond. This contribution makes a first effort to fill that gap by presenting the responses to a survey on Big Data from the Data Protection Authorities of fourteen European countries, together with comparative legal research covering eleven countries. In presenting those results, it addresses ten challenges for the regulation of Big Data.

Relevance:

100.00%

Abstract:

The generation of heterogeneous big data sources with ever-increasing volumes, velocities and veracities over the last few years has inspired the data science and research community to address the challenge of extracting knowledge from big data. Such a wealth of generated data can be intelligently exploited to advance our knowledge of our environment, public health, critical infrastructure and security. In recent years we have developed generic approaches for processing such big data at multiple levels to advance decision support, specifically data processing with semantic harmonisation, low-level fusion, analytics, and knowledge modelling with high-level fusion and reasoning. These approaches will be introduced and presented in the context of the TRIDEC project's results on critical oil and gas industry drilling operations, and of the ongoing large-scale eVacuate project on critical crowd behaviour detection in confined spaces.