880 results for LISP (Computer program language)
Abstract:
A set of algorithms is presented that allows a computer to determine the responses of simulated patients during pure-tone and speech audiometry. Based on these algorithms, a computer program for training in audiometry was written and found to be useful for teaching purposes.
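The idea behind such a simulated patient can be sketched in a few lines: the patient "responds" whenever the presented tone reaches a hidden threshold for that frequency. The thresholds, the `patient_responds` helper, and the `slip` parameter below are invented for illustration and are not the program's actual algorithm.

```python
# Toy simulated audiometry patient. Thresholds (dB HL) per frequency are
# invented values; a real program would model each virtual patient's audiogram.
import random

thresholds_db = {250: 15, 500: 20, 1000: 25, 2000: 40, 4000: 55}

def patient_responds(freq_hz, level_db, slip=0.0):
    """True if the simulated patient hears the tone.
    `slip` adds occasional misses near threshold (0 disables it)."""
    if level_db < thresholds_db[freq_hz]:
        return False
    return random.random() >= slip

# A bracketing procedure (down 10 dB after a response, up 5 dB after a miss,
# in the style of the modified Hughson-Westlake method) could then be run
# against this simulated patient to train threshold-finding.
print(patient_responds(1000, 30))  # True: 30 dB HL is above the 25 dB threshold
```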
Abstract:
PURPOSE: A microangiographic technique is described that allows visualization of small and capillary blood vessels and quantification of fasciocutaneous blood vessels by digital computer analysis in very small laboratory animals. MATERIALS AND METHODS: The left carotid artery of 20 nu/nu mice was cannulated (26 gauge) and a mixture of gelatin, barium sulfate, and green ink was injected according to a standardized protocol. Fasciocutaneous blood vessels were visualized by digital mammography and analyzed for vessel length and vessel surface area in standardized units [SU] by a computer program. RESULTS: With the described microangiography method, fasciocutaneous blood vessels down to the capillary level can be clearly visualized. Regions of interest (ROIs) can be defined and the vascular network they contain quantified. Comparable results can be obtained by calculating the microvascular area index (MAI) and the microvascular length index (MLI), relative to ROI size. Identical ROIs showed high reproducibility, with measured [SU] < 0.01 +/- 0.0012%. CONCLUSION: By combining microsurgical techniques, pharmacological knowledge, and modern digital image technology, we were able to visualize small and capillary blood vessels even in small laboratory animals. Using our own computer analysis program, quantification of vessels was reliable, highly reproducible, and fast.
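Since the MAI and MLI are described as related to ROI size, they can be illustrated as simple ratios of vessel pixels to ROI area. The pixel counts below are invented, and the original program's definitions in standardized units [SU] may differ from this sketch.

```python
# Hedged sketch of ROI-normalized vessel indices: MAI as the area fraction
# of vessel pixels, MLI as skeletonized vessel length per unit ROI area.
# All numbers are invented for illustration.

roi_area_px = 200 * 200          # region of interest, pixels
vessel_area_px = 3_200           # pixels classified as vessel
vessel_length_px = 1_240         # skeletonized vessel length, pixels

mai = vessel_area_px / roi_area_px     # microvascular area index
mli = vessel_length_px / roi_area_px   # microvascular length index
print(round(mai, 3), round(mli, 3))    # 0.08 0.031
```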
Abstract:
OBJECTIVE: The capability of drinks and foods to resist pH changes brought about by salivary buffering may play an important role in the dental erosion process in children. The aim of the present study was to test fruit yogurt, a popular snack for children, measuring its pH and its degrees of saturation (pK-pI) with respect to hydroxyapatite and fluorapatite, to determine its erosive potential. METHOD AND MATERIALS: A variety of fruit yogurts was tested. To test the pH, 8 readings were taken with a pH electrode for each yogurt. Calcium content was determined by atomic absorption spectrophotometry, phosphorus by the inductively coupled plasma method, and fluoride content by ion chromatography. The degrees of saturation with respect to hydroxyapatite and fluorapatite were calculated with a computer program. Statistical analysis was performed using 2-tailed analysis of variance (P < .05) and a post hoc test (Tukey) to determine differences between groups. RESULTS: The pH of each fruit concentrate was significantly different, except for banana yogurt. Except for the phosphorus content of raspberry yogurt, the calcium and phosphorus contents of each fruit concentrate were significantly different. Fluoride levels were the same for all yogurts tested, and the degrees of saturation with respect to hydroxyapatite and fluorapatite were positive, indicating supersaturation. CONCLUSION: It can be stated that fruit yogurt has no erosive potential.
Abstract:
OBJECTIVE: The capability of drinks and foods to resist pH changes brought about by salivary buffering may play an important role in the erosion of dental enamel. The aim of the present study was to measure the initial pH of several types of yogurt and to test the degrees of saturation (pK-pI) with respect to hydroxyapatite and fluorapatite, to determine the buffering capacity and related erosive potential of yogurt. METHOD AND MATERIALS: Twenty-five milliliters of each of 7 types of freshly opened yogurt was titrated with 1 mol/L sodium hydroxide, added in 0.5 mL increments, until the pH reached 10, to assess the total titratable acidity, a measure of the drink's own buffering capacity. The degrees of saturation (pK-pI) with respect to hydroxyapatite and fluorapatite were also calculated, using a computer program developed for this purpose. For statistical analysis, samples were compared using the Kruskal-Wallis test. RESULTS: The buffering capacities can be ordered as follows: fruit yogurt > low-fat yogurt > bioyogurt > butter yogurt > natural yogurt > light fruit yogurt > light yogurt. The results suggest that, in vitro, fruit yogurt has the greatest buffering capacity. CONCLUSION: It can be stated that it is not possible to induce erosion of enamel with any type of yogurt.
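The titration bookkeeping in the protocol above reduces to simple arithmetic: the total titratable acidity is the amount of base added before the pH reaches 10. The pH readings below are invented, not study data; only the 1 mol/L concentration and 0.5 mL increments come from the abstract.

```python
# Illustrative titration bookkeeping for one 25 mL yogurt sample.
naoh_molarity = 1.0   # mol/L, as in the protocol
increment_ml = 0.5    # mL of NaOH added per step

# Invented pH readings after each 0.5 mL addition, stopping once pH >= 10.
ph_readings = [4.2, 4.9, 5.8, 6.9, 8.1, 9.2, 10.1]

added_ml = increment_ml * len(ph_readings)
titratable_acidity_mmol = naoh_molarity * added_ml  # mmol of NaOH consumed
print(added_ml, titratable_acidity_mmol)  # 3.5 3.5
```

A sample with a greater buffering capacity would simply require more increments before reaching pH 10.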
Abstract:
This morning Dr. Risser will introduce you to the basic ideas of social network analysis. You will learn some of the history behind the study of social networks. Dr. Risser will introduce you to mathematical measures of social networks, including centrality measures and measures of spread and cohesion. You will also learn how to use a computer program to analyze social network data.
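Two of the measures mentioned above, centrality and cohesion, can be computed from an edge list in a few lines. The network below is a made-up example, and this sketch uses normalized degree centrality and density as one simple centrality measure and one simple cohesion measure; the session's software may of course use richer variants.

```python
# Minimal sketch: degree centrality (how connected each actor is) and
# density (fraction of possible ties present, a simple cohesion measure).
from itertools import combinations

edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "E")]
nodes = sorted({n for e in edges for n in e})

# Degree centrality: degree / (n - 1), normalized to [0, 1].
degree = {n: 0 for n in nodes}
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
centrality = {n: degree[n] / (len(nodes) - 1) for n in nodes}

# Density: observed ties over all possible ties.
possible = len(list(combinations(nodes, 2)))
density = len(edges) / possible

print(centrality)  # "A" has the highest centrality in this example
print(density)     # 0.5
```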
Abstract:
This morning Ms. Hayden will introduce you to a computer program designed to help you create a poster for presentation tomorrow.
Abstract:
BACKGROUND: Various osteotomy techniques have been developed to correct the deformity caused by slipped capital femoral epiphysis (SCFE) and have been compared by their clinical outcomes. The aim of the present study was to compare an intertrochanteric uniplanar flexion osteotomy with a multiplanar osteotomy in their ability to improve postoperative range of motion, as measured by simulation on computed tomographic data in patients with SCFE. METHODS: We examined 19 patients with moderate or severe SCFE, classified by slippage angle. A computer program for the simulation of movement and osteotomy, developed in our laboratory, was used for the study. From a 3-dimensional reconstruction of the computed tomographic data, the physiological range of motion was determined for flexion, abduction, and internal rotation. The multiplanar osteotomy was compared with the uniplanar flexion osteotomy: both intertrochanteric osteotomy techniques were simulated, and the improvements in range of motion were assessed and compared. RESULTS: The mean slippage (and thus correction) angles measured were 25 degrees (range, 8-46 degrees) inferior and 54 degrees (range, 32-78 degrees) posterior. After the simulation of the multiplanar osteotomy, the virtually measured ranges of motion, as determined by bone-to-bone contact, were 61 degrees for flexion, 57 degrees for abduction, and 66 degrees for internal rotation. The simulation of the uniplanar flexion osteotomy achieved a flexion of 63 degrees, an abduction of 36 degrees, and an internal rotation of 54 degrees. CONCLUSIONS: Apart from abduction, the improvement in range of motion with a uniplanar flexion osteotomy is comparable with that of the multiplanar osteotomy. However, the improvement in flexion in the simulation of both techniques is not satisfactory with regard to the requirements of normal everyday life, in contrast to abduction and internal rotation. LEVEL OF EVIDENCE: Level III, retrospective comparative study.
Abstract:
After 20 years of silence, two recent references from the Czech Republic (Bezpečnostní softwarová asociace, Case C-393/09) and from the English High Court (SAS Institute, Case C-406/10) touch upon several questions that are fundamental for the extent of copyright protection for software under the Computer Program Directive 91/250 (now 2009/24) and the Information Society Directive 2001/29. In Case C-393/09, the European Court of Justice held that “the object of the protection conferred by that directive is the expression in any form of a computer program which permits reproduction in different computer languages, such as the source code and the object code.” As “any form of expression of a computer program must be protected from the moment when its reproduction would engender the reproduction of the computer program itself, thus enabling the computer to perform its task,” a graphical user interface (GUI) is not protected under the Computer Program Directive, as it does “not enable the reproduction of that computer program, but merely constitutes one element of that program by means of which users make use of the features of that program.” While the definition of computer program and the exclusion of GUIs mirror earlier jurisprudence in the Member States and therefore do not come as a surprise, the main significance of Case C-393/09 lies in its interpretation of the Information Society Directive. In confirming that a GUI “can, as a work, be protected by copyright if it is its author’s own intellectual creation,” the ECJ continues the Europeanization of the definition of “work” which began in Infopaq (Case C-5/08). Moreover, the Court elaborated this concept further by excluding expressions from copyright protection which are dictated by their technical function.
Even more importantly, the ECJ held that a television broadcasting of a GUI does not constitute a communication to the public, as the individuals cannot have access to the “essential element characterising the interface,” i.e., the interaction with the user. The exclusion of elements dictated by technical functions from copyright protection and the interpretation of the right of communication to the public with reference to the “essential element characterising” the work may be seen as welcome limitations of copyright protection in the interest of a free public domain which were not yet apparent in Infopaq. While Case C-393/09 has given a first definition of the computer program, the pending reference in Case C-406/10 is likely to clarify the scope of protection against nonliteral copying, namely how far the protection extends beyond the text of the source code to the design of a computer program and where the limits of protection lie as regards the functionality of a program and mere “principles and ideas.” In light of the travaux préparatoires, it is submitted that the ECJ is also likely to grant protection for the design of a computer program, while excluding both the functionality and underlying principles and ideas from protection under the European copyright directives.
Abstract:
Since the UsedSoft ruling of the CJEU in 2012, there has been the distinct feeling that, like the big bang, UsedSoft signals the start of a new beginning. As we enter this brave new world, the Copyright Directive will be read anew: misalignments in the treatment of physical and digital content will be resolved; accessibility and affordability for consumers will be heightened; and lock-in will be reduced as e-exhaustion takes hold. With UsedSoft as a precedent, the Court can do nothing but keep expanding its own ruling. For big bang theorists, it is only a matter of time until the digital first sale meteor strikes non-software downloads as well. This paper looks at whether the UsedSoft ruling could indeed be the beginning of a wider doctrine of e-exhaustion, or whether it is simply a one-shot comet restrained by the provisions of the Computer Program Directive on which it was based. Fighting the latter corner, we have the strict word of the law; in the UsedSoft ruling, the Court appears to willingly bypass the international legal framework of the WCT. As far as expansion goes, the Copyright Directive was conceived specifically to implement the WCT, so the legislative intent is clear. The Court would not, surely, invoke its modicum of creativity there also... With perhaps undue haste in a digital market of many unknowns, it seems this might well be the case. Provoking the big bang theory of e-exhaustion, the UsedSoft ruling can be read as distinctly purposive; rather than having copyright norms in mind, the standard for the Court is the same free movement rules that underpin the exhaustion doctrine in the physical world. With an endowed sense of principled equivalence, the Court clearly wishes the tangible and intangible rules to be aligned. Against the backdrop of the European internal market, perhaps few legislative instruments would staunchly stand in its way. With firm objectives in mind, the UsedSoft ruling could be a rather disruptive meteor indeed.
Abstract:
Life cycle assessment (LCA) of product systems serves to estimate their environmental impact. A complete life-cycle view also requires the inclusion of intralogistics transport processes and equipment. As a rule, a computer program is used to prepare LCAs. The demo versions of three commercial software solutions (SimaPro, GaBi, and Umberto NXT LCA) and the full version of an open-source tool (openLCA) were analyzed from a software-ergonomics perspective. For this purpose, the provided tutorials were reproduced and custom product systems were modeled. The comparative analysis covered: origin, distribution, and target group; suitability of the tutorials and learnability; graphical user interface and customizability of the software; implementation of the requirements of the LCA standards; and the work steps necessary to prepare an LCA. The article includes an introduction to the essential principles of LCA and the fundamentals of software ergonomics, which are subsumed into software-ergonomic criteria for LCA software solutions. The results of the software comparison are then presented, followed by a summary of the findings.
Abstract:
Models of DNA sequence evolution and methods for estimating evolutionary distances are needed for studying the rate and pattern of molecular evolution and for inferring the evolutionary relationships of organisms or genes. In this dissertation, several new models and methods are developed.

Rate variation among nucleotide sites: To obtain unbiased estimates of evolutionary distances, the rate heterogeneity among the nucleotide sites of a gene should be considered. Commonly, it is assumed that the substitution rate varies among sites according to a gamma distribution (the gamma model) or, more generally, an invariant+gamma model that includes some invariable sites. A maximum likelihood (ML) approach was developed for estimating the shape parameter of the gamma distribution (α) and/or the proportion of invariable sites (θ). Computer simulation showed that (1) under the gamma model, α can be well estimated from 3 or 4 sequences if the sequences are long; and (2) the distance estimate is unbiased and robust against violations of the assumptions of the invariant+gamma model. However, this ML method requires a huge amount of computational time and is useful only for fewer than 6 sequences. Therefore, I developed a fast method for estimating α that is easy to implement and requires no knowledge of the tree. A computer program was developed for estimating α and evolutionary distances that can handle as many as 30 sequences.

Evolutionary distances under the stationary, time-reversible (SR) model: The SR model is a general model of nucleotide substitution, which assumes (i) stationary nucleotide frequencies and (ii) time-reversibility. It can be extended to the SRV model, which allows rate variation among sites. I developed a method for estimating the distance under the SR or SRV model, as well as the variance-covariance matrix of the distances. Computer simulation showed that the SR method is better than a simpler method when the sequence length L > 1,000 bp and is robust against deviations from time-reversibility. As expected, when the rate varies among sites, the SRV method is much better than the SR method.

Evolutionary distances under nonstationary nucleotide frequencies: The statistical properties of the paralinear and LogDet distances under nonstationary nucleotide frequencies were studied. First, I developed formulas for correcting the estimation biases of the paralinear and LogDet distances. The performance of these formulas and of the formulas for sampling variances was examined by computer simulation. Second, I developed a method for estimating the variance-covariance matrix of the paralinear distance, so that statistical tests of phylogenies can be conducted when the nucleotide frequencies are nonstationary. Third, a new method for testing the molecular clock hypothesis was developed for the nonstationary case.
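The paralinear distance studied above has a compact closed form (Lake 1994): from the 4x4 joint frequency matrix J of aligned site patterns, d = -(1/4) ln[det(J) / sqrt(det(Dx) det(Dy))], where Dx and Dy are diagonal matrices of the marginal base frequencies. The sketch below follows that published formula; the example matrices are invented, and the dissertation's bias-correction and variance formulas are not reproduced here.

```python
# Sketch of Lake's paralinear distance from a joint frequency matrix.
import numpy as np

def paralinear_distance(J):
    """Paralinear distance from a 4x4 joint site-pattern frequency matrix J."""
    J = np.asarray(J, dtype=float)
    fx = J.sum(axis=1)  # marginal base frequencies of sequence x
    fy = J.sum(axis=0)  # marginal base frequencies of sequence y
    # det of a diagonal matrix is just the product of its entries
    return -0.25 * np.log(np.linalg.det(J) / np.sqrt(np.prod(fx) * np.prod(fy)))

# Identical sequences: all mass on the diagonal, so the distance is ~0.
J_same = np.diag([0.25, 0.25, 0.25, 0.25])
print(paralinear_distance(J_same))

# A diverged pair (invented frequencies): off-diagonal mass gives d > 0.
J_div = np.full((4, 4), 0.02)
np.fill_diagonal(J_div, 0.19)
print(paralinear_distance(J_div))
```

Because the formula uses determinants rather than an assumed substitution model, it stays consistent even when base frequencies drift between the two lineages, which is why it suits the nonstationary setting.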
Abstract:
Health care providers face the problem of trying to make decisions with inadequate information and also with an overload of (often contradictory) information. Physicians often choose treatment long before they know which disease is present. Indeed, uncertainty is intrinsic to the practice of medicine. Decision analysis can help physicians structure and work through a medical decision problem, and can provide reassurance that decisions are rational and consistent with the beliefs and preferences of other physicians and patients.

The primary purpose of this research project is to develop the theory, methods, techniques and tools necessary for designing and implementing a system to support solving medical decision problems. A case study involving “abdominal pain” serves as a prototype for implementing the system. The research, however, focuses on a generic class of problems and aims at covering theoretical as well as practical aspects of the system developed.

The main contributions of this research are: (1) bridging the gap between the statistical approach and the knowledge-based (expert) approach to medical decision making; (2) linking a collection of methods, techniques and tools together to allow for the design of a medical decision support system, based on a framework that involves the Analytic Network Process (ANP), the generalization of the Analytic Hierarchy Process (AHP) to dependence and feedback, for problems involving diagnosis and treatment; (3) enhancing the representation and manipulation of uncertainty in the ANP framework by incorporating group consensus weights; and (4) developing a computer program to assist in the implementation of the system.
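The AHP step underlying the framework above derives priorities as the principal eigenvector of a pairwise comparison matrix, which can be approximated by power iteration. The 3x3 matrix below, comparing three hypothetical diagnoses on Saaty's 1-9 scale, is invented for illustration and is not the dissertation's abdominal-pain model.

```python
# AHP priority vector via power iteration on a pairwise comparison matrix.
# A[i][j] expresses how strongly option i is preferred over option j;
# the matrix is reciprocal (A[j][i] = 1 / A[i][j]).
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

w = [1.0 / len(A)] * len(A)  # start from a uniform vector
for _ in range(100):         # repeatedly apply A and renormalize
    w = [sum(A[i][j] * w[j] for j in range(len(A))) for i in range(len(A))]
    s = sum(w)
    w = [x / s for x in w]   # priorities sum to 1

print([round(x, 3) for x in w])  # the first option gets the largest weight
```

The ANP generalizes this by arranging such matrices in a supermatrix so that criteria and alternatives can depend on one another, but the eigenvector computation per block is the same.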
Abstract:
The lexical items like and well can serve as discourse markers (DMs), but can also play numerous other roles, such as verb or adverb. Identifying the occurrences that function as DMs is an important step for language understanding by computers. In this study, automatic classifiers using lexical, prosodic/positional and sociolinguistic features are trained over transcribed dialogues, manually annotated with DM information. The resulting classifiers improve state-of-the-art performance of DM identification, at about 90% recall and 79% precision for like (84.5% accuracy, κ = 0.69), and 99% recall and 98% precision for well (97.5% accuracy, κ = 0.88). Automatic feature analysis shows that lexical collocations are the most reliable indicators, followed by prosodic/positional features, while sociolinguistic features are marginally useful for the identification of DM like and not useful for well. The differentiated processing of each type of DM improves classification accuracy, suggesting that these types should be treated individually.
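The kind of lexical-context features described above can be illustrated with a toy rule-based classifier for like. The feature names, rules, and example sentences are invented and far simpler than the trained classifiers in the study; they only show why collocations such as "would like" are reliable cues.

```python
# Toy sketch: decide whether a token "like" is a discourse marker (DM)
# from simple lexical-context features. Rules are invented for illustration.

def features(tokens, i):
    prev_tok = tokens[i - 1] if i > 0 else "<s>"
    next_tok = tokens[i + 1] if i + 1 < len(tokens) else "</s>"
    return {"prev": prev_tok, "next": next_tok, "first": i == 0}

def is_dm_like(tokens, i):
    f = features(tokens, i)
    # Collocations such as "would like" or "like to" signal the verb reading.
    if f["prev"] in {"would", "i", "you"} or f["next"] == "to":
        return False
    return True  # default to the DM reading in this toy rule set

print(is_dm_like("it was like really cold".split(), 2))  # True
print(is_dm_like("i would like to go".split(), 2))       # False
```

A trained classifier replaces the hand-written rules with weights learned from the annotated dialogues, and adds the prosodic/positional and sociolinguistic features discussed above.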
Abstract:
Prostate cancer is the most common incident cancer and the second leading cause of cancer death in men in the United States. Although numerous attempts have been made to identify risk factors associated with prostate cancer, the results have been inconsistent and conflicting. The only established risk factors are age and ethnicity. A positive family history of prostate cancer has also been shown to increase the risk two- to three-fold among close relatives.

There are several similarities between breast and prostate cancer that make the relationship between the two of interest: (1) histologically, both cancers are predominantly adenocarcinomas; (2) both organs have a sexual and/or reproductive role; (3) both cancers occur in hormone-responsive tissue; (4) therapy often consists of hormonal manipulation; and (5) the worldwide distribution patterns of prostate and breast cancer are positively correlated.

A family history study was conducted to evaluate the aggregation of prostate cancer and the co-aggregation of breast cancer in 149 patients referred to The University of Texas M.D. Anderson Cancer Center with newly diagnosed prostate cancer. All patients were white, less than 75 years of age at diagnosis, and permanent residents of the United States. Through a personal interview with the proband, family histories were collected on 1,128 first-degree relatives. Cancer diagnoses were verified through medical records or death certificates. Standardized incidence ratios were calculated using Monson's computer program, incorporating data from the Connecticut Tumor Registry.

In this study, familial aggregation of prostate cancer was verified only among the brothers, not among the fathers. Although a statistically significant excess of breast cancer was not found, the increased point estimates in mothers, sisters and daughters are consistent with a co-aggregation hypothesis. Rather surprising was the finding of a seven-fold increased risk of prostate cancer and a three-fold increased risk of breast cancer among siblings in the presence of a maternal history of any cancer. Larger family history studies including high-risk (African-American) and lower-risk (Hispanic) groups and incorporating molecular genetic evaluations should be conducted to determine whether genetic differences play a role in the differential incidence rates across ethnic groups.
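The standardized incidence ratio computed by such a program is the number of observed cases divided by the number expected from reference-registry rates, summed over age strata. The person-years, rates, and case count below are invented numbers for illustration, not data from this study or the Connecticut Tumor Registry.

```python
# Hedged sketch of a standardized incidence ratio (SIR) calculation.
# Each stratum: (person-years at risk, reference rate per 100,000 person-years).
# All numbers are invented.
strata = [
    (4000, 20.0),   # e.g. ages 45-54
    (2500, 120.0),  # e.g. ages 55-64
    (1200, 400.0),  # e.g. ages 65-74
]
observed = 14       # cases observed among the relatives

expected = sum(py * rate / 100_000 for py, rate in strata)
sir = observed / expected
print(round(expected, 2), round(sir, 2))  # 8.6 1.63
```

An SIR above 1 indicates more cases among the relatives than the reference population's rates would predict, which is the signal a familial aggregation analysis looks for.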
Abstract:
Date-32 is a fast and easily used computer program developed to date Quaternary deep-sea cores by associating variations in the earth's orbit with recurring oscillations in core properties, such as carbonate content or isotope composition. Starting with known top and bottom dates, distortions in the periodicities of the core properties due to varying sedimentation rates are realigned by fast Fourier analysis so as to maximise the spectral energy density at the orbital frequencies. This allows age interpolation to all parts of the core to an accuracy of 10 kyrs, or about 1.5% of the record duration for a typical Brunhes sequence. The influence of astronomical forcing is examined and the method is applied to provide preliminary dates in a high-resolution Brunhes record from DSDP Site 594 off southeastern New Zealand.
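The spectral step described above can be sketched as follows: once depths are mapped onto a trial age model, the energy at the orbital frequencies can be read off an FFT power spectrum, and the age model is adjusted to maximize it. The synthetic "carbonate" series below, built from the 41 kyr obliquity and 23 kyr precession periods, is invented; Date-32's actual realignment procedure is not reproduced here.

```python
# Locate the dominant orbital period in a synthetic core record via FFT.
import numpy as np

dt = 2.0                                  # sampling interval, kyr
t = np.arange(0, 492, dt)                 # trial ages, kyr (492 = 12 x 41)
signal = (np.sin(2 * np.pi * t / 41.0)    # obliquity component
          + 0.5 * np.sin(2 * np.pi * t / 23.0))  # weaker precession component

power = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(t), d=dt)     # cycles per kyr

# Skip the zero-frequency bin, then convert the peak frequency to a period.
peak_period = 1.0 / freqs[np.argmax(power[1:]) + 1]
print(round(peak_period, 1))              # 41.0: the obliquity period dominates
```

In the real method this spectrum is evaluated under candidate sedimentation-rate adjustments, and the adjustment that concentrates the most spectral energy at the orbital frequencies fixes the age model.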