991 results for Computer input-output equipment.


Relevance:

20.00%

Abstract:

We present an algorithm for the computation of reducible invariant tori of discrete dynamical systems that is suitable for tori of dimensions larger than 1. It is based on a quadratically convergent scheme that approximates, at the same time, the Fourier series of the torus, its Floquet transformation, and its Floquet matrix. The Floquet matrix describes the linearization of the dynamics around the torus and, hence, its linear stability. The algorithm presents a high degree of parallelism, and the computational effort grows linearly with the number of Fourier modes needed to represent the solution. For these reasons it is a very good option for computing quasi-periodic solutions with several basic frequencies. The paper includes some examples (flows) to show the efficiency of the method on a parallel computer. In these flows we compute invariant tori of dimensions up to 5, by taking suitable sections.
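The central object in such methods is the invariance equation F(K(θ)) = K(θ + ω), where K parameterizes the torus and ω is the internal rotation. As a minimal toy sketch (not the paper's algorithm, which couples a Newton scheme with the Floquet data), one can check the invariance residual of a candidate curve on a grid of angles; here the candidate is the unit circle under a rigid rotation of the plane, which is exactly invariant:

```python
import math

def K(theta):
    """Parameterization of the candidate invariant curve (unit circle)."""
    return (math.cos(theta), math.sin(theta))

def F(point, omega):
    """The map: a rigid rotation of the plane by angle omega."""
    x, y = point
    return (x * math.cos(omega) - y * math.sin(omega),
            x * math.sin(omega) + y * math.cos(omega))

def invariance_residual(omega, n_grid=64):
    """Max norm of F(K(theta)) - K(theta + omega) over a uniform grid.
    A quadratically convergent scheme like the one in the text drives
    this residual to zero while also updating the Floquet data."""
    worst = 0.0
    for j in range(n_grid):
        theta = 2.0 * math.pi * j / n_grid
        fx, fy = F(K(theta), omega)
        kx, ky = K(theta + omega)
        worst = max(worst, math.hypot(fx - kx, fy - ky))
    return worst

omega = 2.0 * math.pi * (math.sqrt(5.0) - 1.0) / 2.0  # irrational rotation number
print(invariance_residual(omega))  # ~0 (rounding error only) for an invariant curve
```

In a real computation K is a truncated Fourier series whose coefficients are the unknowns, which is what makes the effort grow linearly with the number of modes.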

Relevance:

20.00%

Abstract:

This thesis seeks to answer whether communication challenges in virtual teams can be overcome with the help of computer-mediated communication (CMC). Virtual teams are becoming a more common way of working in many global companies. For virtual teams to reach their maximum potential, effective asynchronous and synchronous communication methods are needed. The thesis covers communication in virtual teams, as well as leadership and trust building in virtual environments with the help of CMC. First, the communication challenges in virtual teams are identified using the framework of knowledge sharing barriers in virtual teams by Rosen et al. (2007). Secondly, leadership and trust in virtual teams are defined in the context of CMC. The performance of virtual teams is evaluated in the case study along these three dimensions. With the help of a case study of two virtual teams, the practical issues related to selecting and implementing communication technologies, as well as overcoming knowledge sharing barriers, are discussed. The case study involves a complex inter-organisational setting in which four companies work together to maintain a new IT system. The communication difficulties are related to inadequate communication technologies, lack of trust, and the undefined relationships of the stakeholders and the team members. As a result, it is suggested that communication technologies are needed to improve virtual team performance, but they are not solely capable of solving the communication challenges in virtual teams. In addition, suitable leadership and trust between team members are required to improve knowledge sharing and communication in virtual teams.

Relevance:

20.00%

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in search results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain some information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results: a set of dynamic pages that embed the required information from the database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is the huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases.

Characterizing the deep Web: Though the term deep Web was coined in 2000, which is a long time ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web conducted so far are predominantly based on the study of deep web sites in English. One can then expect that the findings of these surveys may be biased, especially owing to the steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from this national segment of the Web.

Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold, mostly because of the large scale of the deep Web: for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep Web characterization studies and for constructing directories of deep web resources. Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms.

Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user, all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and infeasible for complex queries, yet such queries are essential for many web searches, especially in the area of e-commerce. In this way, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. Besides, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
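The interface-detection step can be illustrated with a hedged sketch (not the I-Crawler itself, and only for plain HTML forms, not the JavaScript-rich ones the thesis targets): a crawler parses a page and flags forms that contain a free-text input, a naive signal that the page may front a searchable database. Only the Python standard library's `html.parser` is used.

```python
from html.parser import HTMLParser

class SearchFormDetector(HTMLParser):
    """Counts <form> elements containing a text-like <input> field --
    a crude heuristic for 'this page may expose a web database'."""
    def __init__(self):
        super().__init__()
        self.in_form = False
        self.has_text_input = False
        self.search_forms = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self.in_form = True
            self.has_text_input = False
        elif tag == "input" and self.in_form:
            # HTML's default input type is "text"
            if attrs.get("type", "text") in ("text", "search"):
                self.has_text_input = True

    def handle_endtag(self, tag):
        if tag == "form":
            if self.has_text_input:
                self.search_forms += 1
            self.in_form = False

page = '<html><body><form action="/search"><input type="text" name="q"></form></body></html>'
detector = SearchFormDetector()
detector.feed(page)
print(detector.search_forms)  # 1
```

A production crawler would additionally classify the form's domain and filter out login or navigation forms, which is where the classification machinery described in the thesis comes in.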

Relevance:

20.00%

Abstract:

The process computer (PTK) of the Mertaniemi power plants was renewed in spring 2005. The purpose of this work has been to help correct errors in the PTK and to map its deficiencies. The work concentrates in particular on building the process reporting. The beginning of the work presents the technical data of the Mertaniemi power plant and the background of the PTK procurement. For the new PTK system, the hardware, the application and the basic software are described, as is the data transfer between the PTK and the other systems. The naming of PTK variables is introduced to make it easier to understand the meaning of the position codes used in the work. The development of process reporting covers the need for the reports and their contents, as well as how the reports were built. Emission reporting is presented as its own section, because the monitoring of power plant emissions is required by official regulations and EU directives. In addition to the reports, shared trend and work-situation displays were created to make it easier to follow process values. The problem areas of the PTK discussed are the errors in variable codes and names and the verification of the PTK calculations. The checking of variable names and calculations was carried out while building the process reporting and in cooperation with Metso Automation Oy, the supplier of the PTK system. Correcting the emission calculations was particularly important.

Relevance:

20.00%

Abstract:

Following their detection and seizure by police and border guard authorities, false identity and travel documents are usually scanned, producing digital images. This research investigates the potential of these images to classify false identity documents, highlight links between documents produced by the same modus operandi or coming from the same source, and thus support forensic intelligence efforts. Inspired by previous research on digital images of Ecstasy tablets, a systematic and complete method has been developed to acquire, collect, process and compare images of false identity documents. This first part of the article highlights the critical steps of the method and the development of a prototype that processes regions of interest extracted from images. Acquisition conditions have been fine-tuned in order to optimise the reproducibility and comparability of images. Different filters and comparison metrics have been evaluated, and the performance of the method has been assessed using two calibration and validation sets of documents, made up of 101 Italian driving licences and 96 Portuguese passports seized in Switzerland, some of which were known to come from common sources. Results indicate that the use of Hue and Edge filters, or their combination, to extract profiles from images, followed by the comparison of profiles with a Canberra distance-based metric, provides the most accurate classification of documents. The method also appears to be quick, efficient and inexpensive. It can be easily operated from remote locations and shared amongst different organisations, which makes it very convenient for future operational applications. The method could serve as a first fast triage method that may help target more resource-intensive profiling methods (based on a visual, physical or chemical examination of documents, for instance).
Its contribution to forensic intelligence and its application to several sets of false identity documents seized by police and border guards will be developed in a forthcoming article (part II).
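The Canberra distance named above compares two profiles term by term, d(x, y) = Σ |xᵢ − yᵢ| / (|xᵢ| + |yᵢ|), which makes it sensitive to proportional differences. A minimal sketch with hypothetical profile values (the article's actual Hue/Edge profiles are not reproduced here):

```python
def canberra(x, y):
    """Canberra distance between two equal-length profiles.
    Terms where both entries are zero are conventionally skipped."""
    if len(x) != len(y):
        raise ValueError("profiles must have equal length")
    total = 0.0
    for a, b in zip(x, y):
        denom = abs(a) + abs(b)
        if denom > 0:
            total += abs(a - b) / denom
    return total

# Hypothetical profiles extracted from two scanned documents:
profile_a = [0.12, 0.30, 0.25, 0.18, 0.15]
profile_b = [0.10, 0.32, 0.26, 0.17, 0.15]
print(canberra(profile_a, profile_b))  # small distance suggests a possible link
```

In a linkage setting, pairwise distances below a calibrated threshold would flag candidate documents from a common source for closer examination.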

Relevance:

20.00%

Abstract:

Objectives: The present study evaluates the reliability of the Radio Memory® software (Radio Memory; Belo Horizonte, Brazil) for classifying lower third molars, analyzing intra- and interexaminer agreement of the results. Study Design: An observational, descriptive study of 280 lower third molars was made. The corresponding orthopantomographs were analyzed by two examiners using the Radio Memory® software. The examination was repeated 30 days after the first observation by each examiner. Both intra- and interexaminer agreement were determined using the SPSS v 12.0 software package for Windows (SPSS; Chicago, USA). Results: Intra- and interexaminer agreement was shown for both the Pell & Gregory and the Winter classifications (p<0.01), with significant correlation between variables in all cases at the 99% level. Conclusions: The use of the Radio Memory® software for the classification of lower third molars is shown to be a valid alternative to the conventional method (direct evaluation on the orthopantomograph), for both clinical and investigational applications.

Relevance:

20.00%

Abstract:

This manual describes how to run the newly produced GUI C++ program, the so-called 'WM' program. Section two describes the instructions for installing the program. Section three illustrates the test runs, including running the WM program, samples of the input and output files, and some generated graphs, followed by the main form of the program, created using Borland C++ Builder 6.

Relevance:

20.00%

Abstract:

At the UdG there are two small wind turbines: the one on the roof of building P2 and a smaller one in the energy laboratory. This second small wind turbine is the one used in this project. It is an Air-X supplied by Technosun, with the following characteristics:
- Weight of 6 kg, radius of 0.582 metres, TSR of 8.8 and power of 545 W.
- SD2030 blade profile.
- Cut-in speed of 3 m/s.
- In strong winds (above 15 m/s) an electronic device reduces the rotational speed to 600 rpm, reducing the loads on the turbine and the structure while it keeps producing energy.
- Low maintenance: it has only two moving parts.
The objective set for this project has been to obtain the power curve of the Air-X mini wind turbine by CFD simulation, using only the geometric data of the device.
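The rated figures above can be sanity-checked against the standard wind-power relation P = ½ ρ A v³ Cp. This sketch assumes standard air density ρ = 1.225 kg/m³ and a hypothetical power coefficient and wind speed, neither of which is stated in the text; only the 0.582 m radius comes from the abstract.

```python
import math

def swept_area(radius_m):
    """Rotor swept area A = pi * r^2."""
    return math.pi * radius_m ** 2

def wind_power(radius_m, wind_speed_ms, cp, rho=1.225):
    """Extracted power: P = 0.5 * rho * A * v^3 * Cp (watts)."""
    return 0.5 * rho * swept_area(radius_m) * wind_speed_ms ** 3 * cp

area = swept_area(0.582)             # Air-X radius from the text
p = wind_power(0.582, 12.0, 0.30)    # hypothetical wind speed and Cp
print(round(area, 3), round(p, 1))
```

A CFD-derived power curve is essentially this relation evaluated with the Cp(TSR, v) that the simulation recovers from the blade geometry, rather than an assumed constant.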

Relevance:

20.00%

Abstract:

Objective: To compare the accuracy of computer-aided ultrasound (US) and magnetic resonance imaging (MRI), by means of hepatorenal gradient analysis, in the evaluation of nonalcoholic fatty liver disease (NAFLD) in adolescents. Materials and Methods: This prospective, cross-sectional study evaluated 50 adolescents (aged 11–17 years), including 24 obese and 26 eutrophic individuals. All adolescents underwent computer-aided US, MRI, laboratory tests, and anthropometric evaluation. Sensitivity, specificity, positive and negative predictive values, and accuracy were evaluated for both imaging methods, with subsequent generation of the receiver operating characteristic (ROC) curve and calculation of the area under the ROC curve to determine the most appropriate cutoff point for the hepatorenal gradient in order to predict the degree of steatosis, utilizing the MRI results as the gold standard. Results: The obese group included 29.2% girls and 70.8% boys, and the eutrophic group 69.2% girls and 30.8% boys. The prevalence of NAFLD was 19.2% in the eutrophic group and 83% in the obese group. The ROC curve generated for the hepatorenal gradient with a cutoff point of 13 presented 100% sensitivity and 100% specificity. When the same cutoff point was applied to the eutrophic group, false-positive results were observed in 9.5% of cases (90.5% specificity) and false-negative results in 0% (100% sensitivity). Conclusion: Computer-aided US with hepatorenal gradient calculation is a simple and noninvasive technique for the semiquantitative evaluation of hepatic echogenicity, and could be useful in the follow-up of adolescents with NAFLD, in population screening for this disease, and in clinical studies.
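The cutoff logic described above can be sketched as a simple rule: classify steatosis when the hepatorenal gradient is at or above the cutoff of 13, then tally sensitivity and specificity against the MRI reference. The gradient values and reference labels below are hypothetical, not the study's data.

```python
def sensitivity_specificity(gradients, has_nafld, cutoff=13.0):
    """Classify steatosis as gradient >= cutoff; compare with the reference."""
    tp = fp = tn = fn = 0
    for g, truth in zip(gradients, has_nafld):
        predicted = g >= cutoff
        if predicted and truth:
            tp += 1
        elif predicted:
            fp += 1
        elif truth:
            fn += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical hepatorenal gradients and MRI-confirmed NAFLD status:
grads = [8.0, 10.5, 14.2, 15.8, 12.1, 20.0, 9.3, 13.0]
truth = [False, False, True, True, False, True, False, True]
sens, spec = sensitivity_specificity(grads, truth)
print(sens, spec)  # 1.0 1.0 for this perfectly separated toy sample
```

Sweeping the cutoff over the observed gradient range and plotting sensitivity against 1 − specificity is exactly how the ROC curve in the study is generated.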

Relevance:

20.00%

Abstract:

A BASIC computer program (REMOVAL) was developed to perform, in a VAX/VMS environment, all the calculations of the removal method for population size estimation (a catch-effort method for closed populations with constant sampling effort). The program follows the maximum likelihood methodology, checks the failure conditions, applies the appropriate formula, and displays the estimates of population size and catchability, with their standard deviations and coefficients of variation, and two goodness-of-fit statistics with their significance levels. Data from removal experiments on the cyprinodontid fish Aphanius iberus in the Alt Empordà wetlands are used to exemplify the use of the program.
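For the two-sample special case, the removal estimator has a well-known closed form (Seber & Le Cren): with catches c₁ and c₂ under equal effort, N̂ = c₁² / (c₁ − c₂) and catchability p̂ = (c₁ − c₂) / c₁. A minimal sketch of that special case follows; it is not the REMOVAL program itself, which handles k samples by full maximum likelihood and also reports standard errors and goodness-of-fit statistics.

```python
def two_pass_removal(c1, c2):
    """Two-sample removal estimates (Seber & Le Cren closed form).
    Requires a declining catch (c1 > c2); a non-declining catch is
    the classic failure condition of the removal method."""
    if c1 <= c2:
        raise ValueError("catch did not decline: removal estimate undefined")
    n_hat = c1 ** 2 / (c1 - c2)      # estimated population size
    p_hat = (c1 - c2) / c1           # estimated catchability per pass
    return n_hat, p_hat

# Hypothetical two-pass catches from a closed population:
n_hat, p_hat = two_pass_removal(100, 50)
print(n_hat, p_hat)  # 200.0 0.5
```

Intuitively, if half the remaining fish are caught each pass, the first catch of 100 implies 200 fish were present.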

Relevance:

20.00%

Abstract:

In order to develop applications for visual interpretation of medical images, the early detection and evaluation of microcalcifications in digital mammograms is very important, since their presence is often associated with a high incidence of breast cancers. Accurate classification into benign and malignant groups would help improve diagnostic sensitivity as well as reduce the number of unnecessary biopsies. The challenge here is the selection of useful features to distinguish benign from malignant microcalcifications. Our purpose in this work is to analyse a microcalcification evaluation method based on a set of shape-based features extracted from the digitised mammogram. The segmentation of the microcalcifications is performed using a fixed-tolerance region growing method to extract the boundaries of calcifications from manually selected seed pixels. Taking into account that the shapes and sizes of clustered microcalcifications have been associated with a high risk of carcinoma based on different subjective measures, such as whether or not the calcifications are irregular, linear, vermiform, branched, rounded or ring-like, our efforts were addressed to obtaining a feature set related to shape. The identification of the parameters concerning the malignant character of the microcalcifications was performed on a set of 146 mammograms whose real diagnoses were known in advance from biopsies. This allowed identifying the following shape-based parameters as the relevant ones: number of clusters, number of holes, area, Feret elongation, roughness, and elongation. Further experiments on a set of 70 new mammograms showed that the performance of the classification scheme is close to the mean performance of three expert radiologists, which allows the proposed method to be considered for assisting diagnosis and encourages continuing the investigation by adding new features not only related to shape.
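The fixed-tolerance region growing step can be sketched as a breadth-first flood fill that accepts 4-connected neighbours whose intensity stays within a fixed tolerance of the seed value. The grid below is a toy image, not mammogram data, and the tolerance value is illustrative.

```python
from collections import deque

def region_grow(image, seed, tolerance):
    """Fixed-tolerance region growing: BFS from the seed pixel, adding
    4-connected neighbours whose intensity differs from the seed value
    by at most `tolerance`."""
    rows, cols = len(image), len(image[0])
    sr, sc = seed
    seed_val = image[sr][sc]
    region = {(sr, sc)}
    queue = deque([(sr, sc)])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(image[nr][nc] - seed_val) <= tolerance):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# Toy image: a bright 'calcification' blob on a dark background.
img = [[10, 10, 10, 10],
       [10, 200, 210, 10],
       [10, 205, 10, 10],
       [10, 10, 10, 10]]
blob = region_grow(img, seed=(1, 1), tolerance=20)
print(sorted(blob))  # [(1, 1), (1, 2), (2, 1)]
```

The boundary of the returned pixel set is what the shape features (area, Feret elongation, roughness, etc.) are then computed from.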

Relevance:

20.00%

Abstract:

Global warming mitigation has recently become a priority worldwide. A large body of literature dealing with energy-related problems has focused on reducing greenhouse gas emissions at an engineering scale. In contrast, the minimization of climate change at a wider macroeconomic level has so far received much less attention. We investigate here the issue of how to mitigate global warming by performing changes in an economy. To this end, we make use of a systematic tool that combines three methods: linear programming, environmentally extended input-output models, and life cycle assessment principles. The problem of identifying key economic sectors that contribute significantly to global warming is posed in mathematical terms as a bi-criteria linear program that seeks to optimize simultaneously the total economic output and the total life-cycle CO2 emissions. We have applied this approach to the European Union economy, finding that significant reductions in global warming potential can be attained by regulating specific economic sectors. Our tool is intended to aid policymakers in the design of more effective public policies for achieving the environmental and economic targets sought.
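The environmentally extended input-output step can be sketched for a toy two-sector economy: total output x solves the Leontief balance x = A x + y (i.e. x = (I − A)⁻¹ y), and total emissions apply sectoral emission intensities to x. The coefficients, demand vector, and CO2 intensities below are hypothetical, not EU data, and the bi-criteria optimization layer of the actual tool is omitted.

```python
def leontief_output(A, y):
    """Solve x = A x + y for a 2-sector economy: x = (I - A)^-1 y,
    using the explicit 2x2 inverse."""
    a, b = 1.0 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1.0 - A[1][1]
    det = a * d - b * c
    x0 = (d * y[0] - b * y[1]) / det
    x1 = (a * y[1] - c * y[0]) / det
    return [x0, x1]

# Hypothetical technical coefficients and final demand (monetary units):
A = [[0.2, 0.3],
     [0.1, 0.4]]
y = [100.0, 50.0]
x = leontief_output(A, y)

# Hypothetical CO2 intensities (kt CO2 per unit of output):
f = [0.5, 1.2]
total_co2 = f[0] * x[0] + f[1] * x[1]
print([round(v, 2) for v in x], round(total_co2, 2))
```

In the bi-criteria linear program, y becomes a decision variable and the two objectives — total output Σx and total emissions f·x — are traded off along a Pareto frontier.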

Relevance:

20.00%

Abstract:

Background and purpose: In planning to meet evidence-based needs for radiotherapy, guidelines for the provision of capital and human resources are central if access, quality and safety are not to be compromised. A component of the ESTRO-HERO (Health Economics in Radiation Oncology) project is to document the current availability and content of guidelines for radiotherapy in Europe. Materials and methods: An 84-part questionnaire was distributed to the European countries through their national scientific and professional radiotherapy societies, with 30 items relating to the availability of guidelines for equipment and staffing and selected operational issues. Twenty-nine countries provided fully or partially evaluable responses. Results: The availability of guidelines across Europe is far from uniform. The metrics used for capital and human resources are variable. There seem to have been no major changes in the availability or specifics of guidelines over the ten-year period since the QUARTS study, with the exception of the recent expansion of RTT staffing models. Where comparison is possible, it appears that staffing for radiation oncologists, medical physicists and particularly RTTs tends to exceed guidelines, suggesting that developments in clinical radiotherapy are moving faster than guideline updating. Conclusion: The efficient provision of safe, high-quality radiotherapy services would benefit from the availability of well-structured guidelines for capital and human resources, based on agreed-upon metrics, which could be linked to detailed estimates of need.