817 results for Computer-Assisted Proofs


Relevance:

80.00%

Publisher:

Abstract:

The attitudes of 328 British secondary school children towards computers were examined in a cross-sectional survey. Measures of both general attitudes towards computers and affective reactions to working with computers were examined in relation to the sex of the subject, the courses studied (computer-related/non-computer-related) and the availability of a home computer. A differential pattern of results was observed. With respect to general attitudes towards computers, main effects were found for all three independent variables, indicating that attitudes were more favourable among those who were male, took computer courses and had a home computer. In contrast, affective reactions to working with computers were primarily related to taking computer courses, such that those taking computer courses reported more positive and fewer negative reactions. The practical and theoretical implications of these results are discussed.

Relevance:

80.00%

Publisher:

Abstract:

The article suggests and implements an approach to creating a specialized website – a club of distance-course authors – on the basis of the Virtual Learning Space “Web-Class KhPI”.

Relevance:

80.00%

Publisher:

Abstract:

In this work, we determine the coset weight spectra of all binary cyclic codes of lengths up to 33, of ternary cyclic and negacyclic codes of lengths up to 20, and of some distance-optimal binary linear codes of lengths up to 33, by using some of the algebraic properties of the codes and a computer-assisted search. Given these weight spectra, the monotonicity of the undetected error probability after t-error correction, P_ue^(t)(C, p), can be checked to any precision in linear time. We used a program written in Maple to check the monotonicity of P_ue^(t)(C, p) for the investigated codes over a finite set of points p ∈ [0, (q-1)/q] and in this way to determine which of them are not proper.
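As an illustration of this kind of check, the following is a minimal sketch in Python (the authors used Maple) that evaluates the undetected error probability from a plain weight distribution via the standard formula P_ue(C, p) = Σ_{w>0} A_w (p/(q-1))^w (1-p)^(n-w) and tests its monotonicity on a finite grid of points in [0, (q-1)/q]. The coset weight spectra needed for the after-t-error-correction case are omitted, and all names are illustrative, not the authors' code.

```python
# Minimal sketch: evaluate P_ue(C, p) from a weight distribution A_w and
# check whether it is monotone non-decreasing on a grid in [0, (q-1)/q].
# Simplification of the paper's procedure; the coset weight spectra used
# for the after-t-error-correction probability are not handled here.

def p_ue(weights, n, q, p):
    """Undetected error probability on a q-ary symmetric channel,
    given the weight distribution `weights` with weights[w] = A_w."""
    return sum(a * (p / (q - 1)) ** w * (1 - p) ** (n - w)
               for w, a in enumerate(weights) if w > 0 and a > 0)

def looks_proper(weights, n, q, samples=10_000):
    """Check monotonicity of P_ue on samples + 1 points in [0, (q-1)/q]."""
    p_max = (q - 1) / q
    values = [p_ue(weights, n, q, i * p_max / samples) for i in range(samples + 1)]
    return all(a <= b + 1e-15 for a, b in zip(values, values[1:]))

# Example: the binary [3, 1] repetition code (A_0 = 1, A_3 = 1) is proper.
print(looks_proper([1, 0, 0, 1], n=3, q=2))  # True
```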

Relevance:

80.00%

Publisher:

Abstract:

The article considers a scenario-based approach to obtaining a quantitative assessment of the ergonomics of learning-system interfaces. The decomposition method and the scenario-composition method are described.

Relevance:

80.00%

Publisher:

Abstract:

Studies of framing in the EU political system are still a rarity and they suffer from a lack of systematic empirical analysis. Addressing this gap, we ask if institutional and policy contexts intertwined with the strategic side of framing can explain the number and types of frames employed by different stakeholders. We use a computer-assisted manual content analysis and develop a fourfold typology of frames to study the frames that were prevalent in the debates on four EU policy proposals within financial market regulation and environmental policy at the EU level and in Germany, Sweden, the Netherlands and the United Kingdom. The main empirical finding is that both contexts and strategies exert a significant impact on the number and types of frames in EU policy debates. In conceptual terms, the article contributes to developing more fine-grained tools for studying frames and their underlying dimensions.

Relevance:

80.00%

Publisher:

Abstract:

Framing plays an important role in public policy. Interest groups strategically highlight some aspects of a policy proposal while downplaying others in order to steer the policy debate in a favorable direction. Despite the importance of framing, we still know relatively little about the framing strategies of interest groups due to methodological difficulties that have prevented scholars from systematically studying interest group framing across a large number of interest groups and multiple policy debates. This article therefore provides an overview of three novel research methods that allow researchers to systematically measure interest group frames. More specifically, this article introduces a word-based quantitative text analysis technique, a manual, computer-assisted content analysis approach and face-to-face interviews designed to systematically identify interest group frames. The results generated by all three techniques are compared on the basis of a case study of interest group framing in an environmental policy debate in the European Union.

Relevance:

80.00%

Publisher:

Abstract:

Social software is increasingly being used in higher and further education to support teaching and learning processes. These applications provide students with social and cognitive stimulation and also add to the interaction between students and educators. However, in addition to these benefits, the introduction of social software into a course environment can also have adverse implications for students, educators and the education institution as a whole, a phenomenon which has received much less attention in the literature. In this study we explore the various implications of introducing social software into a course environment in order to identify the associated benefits, but also the potential drawbacks. We draw on data from 20 social software initiatives in UK-based higher and further education institutions to identify the diverse experiences and concerns of students and educators. The findings are presented in the form of a SWOT analysis, which allows us to better understand the otherwise ambiguous implications of social software in terms of its strengths, weaknesses, opportunities and threats. From the analysis we derive concrete recommendations for the use of social software as a teaching and learning tool.

Relevance:

80.00%

Publisher:

Abstract:

This study examined contextual and situational influences on older adults' decision to complete advance directives by means of a conceptual framework derived from symbolic interactionist theory and a cross-sectional, correlational research design. It was hypothesized that completion of advance directives among older adults would be associated with visiting or participating in the care of a terminally ill or permanently incompetent individual sustained by technology. Using a 53-item questionnaire, computer-assisted telephone interviews (CATI) were conducted with 398 community-dwelling adults between September and October 2003. Respondents were contacted using random-select dialing from a listed sample of 99% of household telephone numbers in one South Florida census tract. Over 90% of households in this tract include an individual age 65 or older. The results revealed that, contrary to most reports in the literature, a substantial proportion of older adults (82%) had completed advance directives, and that the link between older adults and document completion was mainly through attorneys rather than the mandated agents, health care professionals. Further, more than one third of older adults reported that religion/spirituality was not an important part of their life, suggesting that the recommended practice of offering religious/spiritual counseling to all those approaching death be reexamined. The hypothesis was not supported (p > .05), which is explained by the situational emphasis of the variables rather than by structural influences. In logistic regression analysis, only increasing age (p = .001) and higher education (p < .001) were significant, and they explained only 10% of the variance in document completion. Based on the findings, increased interdisciplinary collaboration is suggested with regard to the advance directive agenda. Since attorneys play a key role in document completion, other professions should seek their expertise and collaboration. In addition, the inclusion of a religious/spiritual preference section in all living wills should be considered as an essential part of a holistic and individually appropriate document. Implications for social work education, practice, and advocacy are discussed, as well as suggestions for further research.

Relevance:

80.00%

Publisher:

Abstract:

Respiratory gating in lung PET imaging to compensate for respiratory motion artifacts is a current research issue with broad potential impact on quantitation, diagnosis and clinical management of lung tumors. However, PET images collected at discrete bins can be significantly affected by noise, as there are lower activity counts in each gated bin unless the total PET acquisition time is prolonged, so gating methods should be combined with imaging-based motion correction and registration methods. The aim of this study was to develop and validate a fast and practical solution to the problem of respiratory motion for the detection and accurate quantitation of lung tumors in PET images. This included: (1) developing a computer-assisted algorithm for PET/CT images that automatically segments lung regions in CT images and identifies and localizes lung tumors in PET images; (2) developing and comparing different registration algorithms that process all the information within the entire respiratory cycle and integrate the tumor data from the different gated bins into a single reference bin. Four registration/integration algorithms (Centroid Based, Intensity Based, Rigid Body and Optical Flow registration) were compared, as well as two registration schemes (Direct Scheme and Successive Scheme). Validation was demonstrated by conducting experiments with the computerized 4D NCAT phantom and with a dynamic lung-chest phantom imaged using a GE PET/CT system. Iterations were conducted on simulated tumors of different sizes and at different noise levels. Static tumors without respiratory motion were used as the gold standard; quantitative results were compared with respect to tumor activity concentration, cross-correlation coefficient, relative noise level and computation time. Comparing the tumors before and after correction, the tumor activity values and tumor volumes were closer to those of the static tumors (gold standard). Higher correlation values and lower noise were also achieved after applying the correction algorithms. With this method, the compromise between short PET scan time and reduced image noise can be achieved, while quantification and clinical analysis become fast and precise.
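To make the centroid-based registration/integration step concrete, here is a minimal sketch, assuming each gated bin is available as a 3-D NumPy array with a matching binary tumor mask; the function name and interface are illustrative and not the authors' implementation.

```python
# Minimal sketch of centroid-based registration/integration of gated PET bins.
# Assumes `bins` is a list of 3-D numpy arrays (one per respiratory bin) and
# `masks` is a list of matching binary tumor masks. Illustrative only.
import numpy as np
from scipy.ndimage import center_of_mass, shift

def centroid_integrate(bins, masks, reference=0):
    """Translate every gated bin so its tumor centroid coincides with the
    centroid in the reference bin, then sum the aligned bins to recover
    the counts that gating spread across the respiratory cycle."""
    ref_centroid = np.array(center_of_mass(masks[reference]))
    integrated = np.zeros_like(bins[reference], dtype=float)
    for volume, mask in zip(bins, masks):
        offset = ref_centroid - np.array(center_of_mass(mask))
        integrated += shift(volume, offset, order=1)  # trilinear resampling
    return integrated
```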

Relevance:

80.00%

Publisher:

Abstract:

Three-dimensional (3-D) imaging is vital in computer-assisted surgical planning, including minimally invasive surgery, targeted drug delivery, and tumor resection. Selective Internal Radiation Therapy (SIRT) is a liver-directed radiation therapy for the treatment of liver cancer. Accurate calculation of anatomical liver and tumor volumes is essential for the determination of the tumor-to-normal-liver ratio and for the calculation of the dose of Y-90 microspheres that will result in a high concentration of the radiation in the tumor region as compared to nearby healthy tissue. Present manual techniques for segmentation of the liver from Computed Tomography (CT) tend to be tedious and greatly dependent on the skill of the technician/doctor performing the task. This dissertation presents the development and implementation of a fully integrated algorithm for 3-D liver and tumor segmentation from tri-phase CT that yields highly accurate estimations of the respective volumes of the liver and tumor(s). The algorithm as designed requires minimal human intervention without compromising the accuracy of the segmentation results. Embedded within this algorithm is an effective method for extracting the blood vessels that feed the tumor(s) in order to plan the appropriate treatment effectively. Segmentation of the liver achieved an accuracy in excess of 95% in estimating liver volumes in 20 datasets in comparison to the manual gold standard volumes. In a similar comparison, tumor segmentation exhibited an accuracy of 86% in estimating tumor volume(s). Qualitative results of the blood vessel segmentation algorithm demonstrated its effectiveness in extracting and rendering the vasculature structure of the liver. Results of the parallel computing process, using a single workstation, showed a 78% gain. Also, statistical analysis carried out to determine whether the manual initialization has any impact on the accuracy showed that the results were independent of user initialization. The dissertation thus provides a complete 3-D solution for liver cancer treatment planning, with the opportunity to extract, visualize and quantify the statistics needed for liver cancer treatment. Since SIRT requires highly accurate calculation of the liver and tumor volumes, this new method provides an effective and computationally efficient process for such challenging clinical requirements.
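As a small illustration of the volume comparison against the manual gold standard reported above, here is a minimal sketch, assuming the automatic and manual segmentations are available as binary NumPy masks with known voxel spacing; the helpers are hypothetical and not part of the dissertation's pipeline.

```python
# Minimal sketch: segmentation volume and percent accuracy versus a manual
# gold standard, assuming binary masks and voxel spacing in mm (dz, dy, dx).
# Hypothetical helpers for illustration only.
import numpy as np

def volume_ml(mask, spacing_mm):
    """Volume of a binary segmentation mask in millilitres."""
    return mask.sum() * np.prod(spacing_mm) / 1000.0  # mm^3 -> mL

def volume_accuracy_percent(auto_mask, manual_mask, spacing_mm):
    """Percent agreement of the automatic volume with the manual volume."""
    v_auto = volume_ml(auto_mask, spacing_mm)
    v_manual = volume_ml(manual_mask, spacing_mm)
    return 100.0 * (1.0 - abs(v_auto - v_manual) / v_manual)
```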

Relevance:

80.00%

Publisher:

Abstract:

To date, hospitality management educators have struggled to modify generic software or adapt vendor-designed industry systems as a means of bringing hospitality information systems to the classroom. Specially designed computer-based courseware can enhance learning while extending the boundaries of the traditional hospitality classroom. The author discusses the relevance of this software to the hospitality curriculum.

Relevance:

80.00%

Publisher:

Abstract:

Electronic database handling of business information has gradually gained popularity in the hospitality industry. This article provides an overview of the fundamental concepts of a hotel database and investigates the feasibility of incorporating computer-assisted data mining techniques into hospitality database applications. The author also exposes some potential myths associated with data mining in hospitality database applications.

Relevance:

80.00%

Publisher:

Abstract:

We quantified pigment biomarkers by high-performance liquid chromatography (HPLC) to obtain a broad taxonomic classification of the microphytobenthos (MPB) (i.e. identification of dominant taxa). Three replicate sediment cores were collected at 0, 50 and 100 m along transects 5-9 in Heron Reef lagoon (n=15) (Fig. 1). Transects 1-4 could not be processed because the means to have the samples analysed by HPLC were not available at the time of field data collection. Cores were stored frozen, and scrapes were taken from the top of each one and placed in cryovials immersed in dry ice. Samples were sent to the laboratory (CSIRO Marine and Atmospheric Research, Hobart, Australia), where pigments were extracted with 100% acetone for fifteen hours at 4°C after vortex mixing (30 seconds) and sonication (15 minutes). Samples were then centrifuged and filtered prior to the analysis of pigment composition with a Waters Alliance HPLC system equipped with a photo-diode array detector. Pigments were separated using a Zorbax Eclipse XDB-C8 stainless steel 150 mm x 4.6 mm ID column with 3.5 µm particle size (Agilent Technologies) and a binary gradient system with an elevated column temperature, following a modified version of the Van Heukelem and Thomas (2001) method. The separated pigments were detected at 436 nm and identified against standard spectra using Waters Empower software. Standards for HPLC system calibration were obtained from Sigma (USA) and DHI (Denmark).

Relevance:

80.00%

Publisher:

Abstract:

Into the Bends of Time is a 40-minute work in seven movements for a large chamber orchestra with electronics, utilizing real-time computer-assisted processing of music performed by live musicians. The piece explores various combinations of interactive relationships between players and electronics, ranging from relatively basic processing effects to musical gestures achieved through stages of computer analysis, in which resulting sounds are crafted according to parameters of the incoming musical material. Additionally, some elements of interaction are multi-dimensional, in that they rely on the participation of two or more performers fulfilling distinct roles in the interactive process with the computer in order to generate musical material. Through processes of controlled randomness, several electronic effects induce elements of chance into their realization so that no two performances of this work are exactly alike. The piece gets its name from the notion that real-time computer-assisted processing, in which sound pressure waves are transduced into electrical energy, converted to digital data, artfully modified, converted back into electrical energy and transduced into sound waves, represents a “bending” of time.

The Bill Evans Trio featuring bassist Scott LaFaro and drummer Paul Motian is widely regarded as one of the most important and influential piano trios in the history of jazz, lauded for its unparalleled level of group interaction. Most analyses of Bill Evans’ recordings, however, focus on his playing alone and fail to take group interaction into account. This paper examines one performance in particular, of Victor Young’s “My Foolish Heart” as recorded in a live performance by the Bill Evans Trio in 1961. In Part One, I discuss Steve Larson’s theory of musical forces (expanded by Robert S. Hatten) and its applicability to jazz performance. I examine other recordings of ballads by this same trio in order to draw observations about normative ballad performance practice. I discuss meter and phrase structure and show how the relationship between the two is fixed in a formal structure of repeated choruses. I then develop a model of perpetual motion based on the musical forces inherent in this structure. In Part Two, I offer a full transcription and close analysis of “My Foolish Heart,” showing how elements of group interaction work with and against the musical forces inherent in the model of perpetual motion to achieve an unconventional, dynamic use of double-time. I explore the concept of a unified agential persona and discuss its role in imparting the song’s inherent rhetorical tension to the instrumental musical discourse.