929 results for Personal uses of computer
Abstract:
Objective: To study student and staff views of the role and use of handouts, note-taking and overhead transparencies in veterinary science lectures at the University of Queensland. Methods: The Nominal Group Technique was used to help develop a questionnaire, which was completed by 351 students (a response rate of 84%) and 35 staff (76%) across the 5 years of the veterinary course. The data were analysed using the SAS statistical computer package. Results: Staff and students held different views as to how frequently handouts should be used, their educational value, and whether they should be complete or partial. Fewer students than staff agreed that handouts discourage further reading in a subject. Almost all staff and students saw the central functions of note-taking as providing notes for subsequent revision and encoding the information given by the lecturer. More students than staff, however, considered that note-taking in lectures interferes with understanding. Staff and students held similar views as to the uses of overheads in lectures. Interestingly, however, more staff than students agreed that overheads often contain too much information. Conclusion: Both students and staff saw the central role of note-taking as providing a set of good notes for revision. Generally, students preferred that this information be provided in the form of partial or complete handouts, while staff preferred students to take notes and to read outside lectures. Surprisingly, more staff than students felt that overhead transparencies often contained too much information. Note-taking, handouts and overhead transparencies need to be linked in a coherent educational strategy to promote effective learning.
Abstract:
This study integrated the research streams of computer-mediated communication (CMC) and group conflict by comparing the expression of different types of conflict in CMC groups and face-to-face (FTF) groups over time. The main aim of the study was to compare the cues-filtered-out approach against social information processing theory. A laboratory study was conducted with 39 groups (19 CMC and 20 FTF) in which members were required to work together over three sessions. The frequencies of task, process, and relationship conflict were analyzed. The findings supported social information processing theory. There was more process and relationship conflict in CMC groups than in FTF groups on Day 1. However, this difference disappeared on Days 2 and 3. There was no difference between CMC and FTF groups in the amount of task conflict expressed on any day.
Abstract:
In the past few years, so-called gadgets like cellular phones, personal digital assistants and digital cameras have become more widespread, even among less technologically aware users. However, for several reasons, the factory-floor itself seems to be hermetic to these changes. After the fieldbus revolution, the factory-floor has seen an increased use of more and more powerful programmable logic controllers and user interfaces, but the way they are used remains almost the same. We believe that new user-computer interaction techniques, including multimedia and augmented reality, combined with now affordable technologies like wearable computers and wireless networks, can change the way factory personnel work together with the machines and the information system on the factory-floor. This new age is already starting with innovative uses of communication networks on the factory-floor, either using "standard" networks or enhancing industrial networks with multimedia and wireless capabilities.
Abstract:
Teaching and learning computer programming is as challenging as it is difficult. Assessing students' work and providing individualised feedback to all of them is time-consuming and error-prone for teachers, and frequently involves a time delay. Existing tools and specifications prove insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are appearing, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions regarding the monitoring of student progress and the timeliness of feedback. This paper provides a conceptual design model for a computer programming learning environment. This environment uses the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. This model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
Abstract:
Nepal has a long history of medical radiology since 1923 but, unfortunately, we still do not have any radiation protection infrastructure to control the use of ionizing radiation in its various fields. The objective of this study was to assess radiation protection in medical uses of ionizing radiation. Twenty-eight hospitals with diagnostic radiology facilities were chosen for this study according to patient load, equipment and working staff. Radiation surveys were also done at five different radiotherapy centers. Questionnaires for radiation workers were used, radiation dose levels were measured, and an inventory of available radiation equipment was made. A corollary objective of the study was to create awareness among workers of possible radiation health hazards and risks. It was also deemed important to know the level of understanding of the radiation workers in order to initiate steps towards the establishment of Nepalese laws, regulations and a code of radiological practice in this field. Altogether, 203 radiation workers completed the questionnaire, of whom 41 were from radiotherapy and 162 from diagnostic radiology. The radiation workers who participated in the questionnaire represent more than 50% of the radiation workers in this field in Nepal. Almost all X-ray, CT and mammography installations were built according to protection criteria and hence found safe. Radiation dose levels at the reference points of all five radiotherapy centers were within safe limits. Around 65% of the radiation workers have never been monitored for radiation. There is no quality control program in any of the surveyed hospitals except the radiotherapy facilities.
Abstract:
The emergence of open source software in recent years has become a common topic of study in different fields, from its most technical characteristics to its economic aspects. This paper examines the current state of the literature on the economics of open source and explores the uses, infrastructure and expectations of retail businesses and institutions of the town of Igualda regarding it. This qualitative case study finds that the current equipment and level of ICT use are low and that the town's stores are receptive to a potential introduction of open source software.
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or Energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be valid solutions for the presented problem. One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also, in combination with NMSE and PLS, makes this rate more stable. Another advance is its generalization ability, since the experiments were performed on two image modalities (SPECT and PET).
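As an illustration of the evaluation pipeline summarized above, the sketch below combines PLS-based feature reduction, an SVM classifier and k-fold cross-validation. It assumes a NumPy matrix X of NMSE features (one row per subject) and binary labels y, and omits the ROI selection and LMNN stages, so it is only a schematic of the protocol, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): PLS feature reduction + SVM with
# k-fold cross-validation. X is assumed to be an (n_subjects x n_features) array
# of NMSE features and y a 0/1 label vector (1 = AD, 0 = control).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import confusion_matrix

def evaluate(X, y, n_components=10, n_splits=10):
    accs, sens, specs = [], [], []
    for train, test in StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0).split(X, y):
        pls = PLSRegression(n_components=n_components).fit(X[train], y[train])
        clf = SVC(kernel="linear").fit(pls.transform(X[train]), y[train])
        y_pred = clf.predict(pls.transform(X[test]))
        tn, fp, fn, tp = confusion_matrix(y[test], y_pred).ravel()
        accs.append((tp + tn) / (tp + tn + fp + fn))
        sens.append(tp / (tp + fn))    # sensitivity: correctly detected patients
        specs.append(tn / (tn + fp))   # specificity: correctly detected controls
    return np.mean(accs), np.mean(sens), np.mean(specs)
```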
Abstract:
Malposition of the acetabular component during hip arthroplasty increases the occurrence of impingement, reduces range of motion, and increases the risk of dislocation and long-term wear. To prevent malpositioned hip implants, an increasing number of computer-assisted orthopaedic systems have been described, but their accuracy is not well established. The purpose of this study was to determine the reproducibility and accuracy of conventional versus computer-assisted techniques for positioning the acetabular component in total hip arthroplasty. Using a lateral approach, 150 cups were placed by 10 surgeons in 10 identical plastic pelvis models (freehand, with a mechanical guide, or using computer assistance). Conditions for cup implantation were made to mimic the operating room situation. Preoperative planning was done from a computed tomography scan. The accuracy of cup abduction and anteversion was assessed with an electromagnetic system. Freehand placement revealed a mean accuracy of cup anteversion and abduction of 10 degrees and 3.5 degrees, respectively (maximum error, 35 degrees). With the cup positioner, these angles measured 8 degrees and 4 degrees (maximum error, 29.8 degrees), respectively, and with computer assistance, 1.5 degrees and 2.5 degrees (maximum error, 8 degrees), respectively. Computer-assisted cup placement was an accurate and reproducible technique for total hip arthroplasty. It was more accurate than traditional methods of cup positioning.
Abstract:
This paper presents a pattern recognition method focused on images of paintings. The purpose is to construct a system able to recognize authors or art styles based on common elements of their work (here called patterns). The method is based on comparing images that contain the same or similar patterns. It uses different computer vision techniques, such as SIFT and SURF, to describe the patterns as descriptors, K-Means to classify and simplify these descriptors, and RANSAC to detect and validate good matches. The method is good at finding patterns of known images, but less so when the images are unknown.
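For orientation, the sketch below shows one way the SIFT-plus-RANSAC matching step mentioned above could look using OpenCV's Python API; the image paths are placeholders and the K-Means descriptor clustering stage is omitted, so this is a generic illustration rather than the method described in the paper.

```python
# Illustrative sketch (not the authors' implementation): match SIFT descriptors
# between a reference pattern and a query painting, then use RANSAC to keep only
# geometrically consistent matches.
import cv2
import numpy as np

def count_pattern_matches(query_path, pattern_path, ratio=0.75):
    img1 = cv2.imread(pattern_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    # Lowe's ratio test to discard ambiguous matches.
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < ratio * n.distance]
    if len(good) < 4:
        return 0
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # RANSAC homography: inliers are matches consistent with one geometric transform.
    _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return int(mask.sum()) if mask is not None else 0
```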
Abstract:
The optimization of the pilot overhead in single-user wireless fading channels is investigated, and the dependence of this overhead on various system parameters of interest (e.g., fading rate, signal-to-noise ratio) is quantified. The achievable pilot-based spectral efficiency is expanded with respect to the fading rate about the no-fading point, which leads to an accurate order expansion for the pilot overhead. This expansion identifies that the pilot overhead, as well as the spectral efficiency penalty with respect to a reference system with genie-aided CSI (channel state information) at the receiver, depend on the square root of the normalized Doppler frequency. It is also shown that the widely-used block fading model is a special case of more accurate continuous fading models in terms of the achievable pilot-based spectral efficiency. Furthermore, it is established that the overhead optimization for multiantenna systems is effectively the same as for single-antenna systems with the normalized Doppler frequency multiplied by the number of transmit antennas.
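Schematically (these are not the paper's exact expressions), the square-root dependence reported above can be written as follows, where $\alpha^{\star}$ denotes the optimized pilot overhead, $f_{\mathrm{D}}$ the normalized Doppler frequency, and $c_{1}$, $c_{2}$ coefficients that depend on the SNR and the fading model:

```latex
% Schematic orders only; the coefficients depend on the SNR and the Doppler spectrum.
\alpha^{\star} \;=\; c_{1}\sqrt{f_{\mathrm{D}}} \;+\; \text{higher-order terms},
\qquad
C_{\text{genie}} - C_{\text{pilot}} \;=\; c_{2}\sqrt{f_{\mathrm{D}}} \;+\; \text{higher-order terms}.
```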
Abstract:
Tone mapping is the problem of compressing the range of a high-dynamic-range image so that it can be displayed on a low-dynamic-range screen without losing details or introducing new ones: the final image should produce in the observer a sensation as close as possible to the perception produced by the real-world scene. We propose a tone mapping operator with two stages. The first stage is a global method that implements visual adaptation, based on experiments on human perception; in particular, we point out the importance of cone saturation. The second stage performs local contrast enhancement, based on a variational model inspired by color vision phenomenology. We evaluate this method with a metric validated by psychophysical experiments and, in terms of this metric, our method compares very well with the state of the art.
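The two-stage structure described above (global visual adaptation with cone saturation, followed by local contrast enhancement) can be illustrated with the simplified sketch below; the global curve and the unsharp-mask local step are generic stand-ins, not the operator or the variational model proposed in the abstract.

```python
# Illustrative two-stage structure only (not the proposed operator): stage 1
# applies a global, cone-response-like curve L / (L + adaptation) that saturates
# highlights; stage 2 adds simple local contrast via an unsharp-mask step.
import numpy as np
from scipy.ndimage import gaussian_filter

def tone_map(hdr_luminance, adaptation=None, local_gain=0.3, blur_sigma=8.0):
    L = np.asarray(hdr_luminance, dtype=np.float64)
    if adaptation is None:
        adaptation = np.exp(np.mean(np.log(L + 1e-8)))   # log-average ("key") of the scene
    global_stage = L / (L + adaptation)                   # global compression with saturation
    base = gaussian_filter(global_stage, blur_sigma)      # low-frequency base layer
    local_stage = global_stage + local_gain * (global_stage - base)  # boost local contrast
    return np.clip(local_stage, 0.0, 1.0)
```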
Abstract:
This paper applies random matrix theory to obtain analytical characterizations of the capacity of correlated multiantenna channels. The analysis is not restricted to the popular separable correlation model, but rather it embraces a more general representation that subsumes most of the channel models that have been treated in the literature. For arbitrary signal-to-noise ratios (SNR), the characterization is conducted in the regime of large numbers of antennas. For the low- and high-SNR regions, in turn, we uncover compact capacity expansions that are valid for arbitrary numbers of antennas and that shed insight on how antenna correlation impacts the tradeoffs between power, bandwidth and rate.
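For background, the quantity being characterized is the standard ergodic capacity of a multiantenna channel; the separable (Kronecker) correlation model mentioned above is shown on the right, with the caveat that the paper's analysis covers a more general class of correlation structures:

```latex
% Standard definitions for context only; H_w has i.i.d. entries and R_T, R_R are
% the transmit and receive correlation matrices of the separable model.
C(\mathsf{SNR}) \;=\; \mathbb{E}\!\left[\log_{2}\det\!\left(\mathbf{I}
  + \frac{\mathsf{SNR}}{n_{\mathrm{T}}}\,\mathbf{H}\mathbf{H}^{\dagger}\right)\right],
\qquad
\mathbf{H} \;=\; \mathbf{R}_{\mathrm{R}}^{1/2}\,\mathbf{H}_{w}\,\mathbf{R}_{\mathrm{T}}^{1/2}.
```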
Abstract:
Expressions relating spectral efficiency, power, and Doppler spectrum are derived for Rayleigh-faded wireless channels with Gaussian signal transmission. No side information on the state of the channel is assumed at the receiver. Rather, periodic reference signals are postulated, in accordance with the functioning of most wireless systems. The analysis relies on a well-established lower bound that is generally tight and asymptotically exact at low SNR. In contrast with most previous studies, which relied on block-fading channel models, a continuous-fading model is adopted. This embeds the Doppler spectrum directly in the derived expressions, imbuing them with practical significance. Closed-form relationships are obtained for the popular Clarke-Jakes spectrum, and informative expansions, valid for arbitrary spectra, are found for the low- and high-power regimes. While the paper focuses on scalar channels, the extension to multiantenna settings is also discussed.
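For reference, the Clarke-Jakes Doppler spectrum mentioned above has the standard unit-power form below, with $f_{\mathrm{D}}$ the maximum Doppler frequency; this is textbook material rather than a result of the paper:

```latex
% Standard Clarke-Jakes (classical) Doppler spectrum.
S(\nu) \;=\; \frac{1}{\pi f_{\mathrm{D}}\sqrt{1-\left(\nu/f_{\mathrm{D}}\right)^{2}}},
\qquad |\nu| < f_{\mathrm{D}}.
```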
Abstract:
Supported by IEEE 802.15.4 standardization activities, embedded networks have been gaining popularity in recent years. The focus of this paper is to quantify the behavior of key networking metrics of IEEE 802.15.4 beacon-enabled nodes under typical operating conditions, with the inclusion of packet retransmissions. We corrected and extended previous analyses by scrutinizing the assumptions on which the prevalent Markovian modeling is generally based. By means of a comparative study, we singled out which of the assumptions impact each of the performance metrics (throughput, delay, power consumption, collision probability, and packet-discard probability). In particular, we showed that, unlike what is usually assumed, the probability that a node senses the channel busy is not constant across the stages of the backoff procedure, and that these differences have a noticeable impact on backoff delay, packet-discard probability, and power consumption. Similarly, we showed that, again contrary to common assumption, the probability of obtaining transmission access to the channel depends on the number of nodes that are simultaneously sensing it. We evidenced that ignoring this dependence has a significant impact on the calculated values of throughput and collision probability. Circumventing these and other assumptions, we rigorously characterize, through a semianalytical approach, the key metrics in a beacon-enabled IEEE 802.15.4 system with retransmissions.
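To fix notation for the backoff procedure whose stage-dependent behaviour is analysed above, the sketch below draws the random backoff delay at each slotted CSMA-CA stage using the default IEEE 802.15.4 parameters (macMinBE = 3, macMaxBE = 5, macMaxCSMABackoffs = 4); it is a simplified illustration of the standard's backoff-window growth, not the paper's semianalytical model.

```python
# Simplified sketch of slotted CSMA-CA backoff-window growth (IEEE 802.15.4 defaults).
# The per-stage window size is what makes the busy-channel probability stage-dependent.
import random

MAC_MIN_BE, MAC_MAX_BE, MAC_MAX_CSMA_BACKOFFS = 3, 5, 4

def draw_backoff_delays():
    """Return the random backoff delay (in backoff periods) drawn at each stage."""
    delays = []
    be = MAC_MIN_BE
    for stage in range(MAC_MAX_CSMA_BACKOFFS + 1):
        delays.append(random.randint(0, 2**be - 1))   # uniform in [0, 2^BE - 1]
        be = min(be + 1, MAC_MAX_BE)                   # BE grows until macMaxBE
    return delays
```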
Abstract:
Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only for short time intervals, but also for mid-term and relatively long intervals. This fact is observed independently of the descriptor, musical facet and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e. different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, allows for reduced computational storage and still maintains highly competitive accuracies when compared to state-of-the-art systems.
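As a rough illustration of how descriptor predictability could be exploited, the sketch below fits an autoregressive (AR) model to a one-dimensional descriptor time series from one recording and measures its prediction error on another; the function names and the choice of an AR model are assumptions made for illustration, not the authors' system.

```python
# Minimal sketch (not the authors' system): fit an AR model to a 1-D music-descriptor
# time series and measure its one-step-ahead prediction error on another series.
# A lower error would suggest the two recordings are related (e.g. cover versions).
import numpy as np

def fit_ar(series, order=8):
    """Least-squares fit of AR coefficients: x[t] ~ sum_k a[k] * x[t - order + k]."""
    x = np.asarray(series, dtype=np.float64)
    rows = np.stack([x[k:len(x) - order + k] for k in range(order)], axis=1)
    coeffs, *_ = np.linalg.lstsq(rows, x[order:], rcond=None)
    return coeffs

def prediction_error(series, coeffs):
    """Mean squared one-step-ahead prediction error of an AR model on a series."""
    x = np.asarray(series, dtype=np.float64)
    order = len(coeffs)
    rows = np.stack([x[k:len(x) - order + k] for k in range(order)], axis=1)
    return float(np.mean((rows @ coeffs - x[order:]) ** 2))
```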