Abstract:
Although cross-sectional diffusion tensor imaging (DTI) studies have revealed significant white matter changes in mild cognitive impairment (MCI), the utility of this technique in predicting further cognitive decline is debated. Thirty-five healthy controls (HC) and 67 MCI subjects with DTI baseline data were neuropsychologically assessed at one year. Of these, 40 were stable (sMCI; 9 single-domain amnestic, 7 single-domain frontal, 24 multiple-domain) and 27 progressive (pMCI; 7 single-domain amnestic, 4 single-domain frontal, 16 multiple-domain). Fractional anisotropy (FA) and longitudinal, radial, and mean diffusivity were measured using Tract-Based Spatial Statistics. Statistics included group comparisons and individual classification of MCI cases using support vector machines (SVM). FA was significantly higher in HC compared to MCI in a distributed network including the ventral part of the corpus callosum and right temporal and frontal pathways. There were no significant group-level differences between sMCI and pMCI or between MCI subtypes after correction for multiple comparisons. However, SVM analysis allowed for individual classification with accuracies up to 91.4% (HC versus MCI) and 98.4% (sMCI versus pMCI). When considering the MCI subgroups separately, the minimum SVM classification accuracy for stable versus progressive cognitive decline was 97.5%, in the multiple-domain MCI group. SVM analysis of DTI data provided highly accurate individual classification of stable versus progressive MCI regardless of MCI subtype, indicating that this method may become an easily applicable tool for early individual detection of MCI subjects evolving to dementia.
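The individual classification step described above rests on a standard supervised learning setup: one FA feature vector per subject, labelled by group. As a purely illustrative sketch (toy data and a from-scratch linear SVM trained by sub-gradient descent on the hinge loss, not the authors' pipeline):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200, seed=0):
    """Train a linear SVM by sub-gradient descent on the regularized
    hinge loss.  Labels y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:  # point inside the margin: hinge term is active
                w = (1 - lr * lam) * w + lr * y[i] * X[i]
                b += lr * y[i]
            else:           # only the L2 penalty contributes
                w = (1 - lr * lam) * w
    return w, b

def predict(w, b, X):
    """Classify rows of X by the sign of the decision function."""
    return np.where(X @ w + b >= 0, 1, -1)
```

In practice a library implementation with nested cross-validation would be used, and the feature vectors would come from the TBSS white-matter skeleton rather than synthetic values.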
Abstract:
Proponents of microalgae biofuel technologies often claim that the world demand for liquid fuels, about 5 trillion liters per year, could be supplied by microalgae cultivated on only a few tens of millions of hectares. This perspective reviews the subject and points out that such projections are greatly exaggerated, because (1) the productivities achieved in large-scale commercial microalgae production systems, operated year-round, do not surpass those of irrigated tropical crops; (2) cultivating, harvesting and processing microalgae solely for the production of biofuels is simply too expensive using current or prospective technology; and (3) currently available (limited) data suggest that the energy balance of algal biofuels is very poor. Thus, microalgal biofuels are no panacea for depleting oil or global warming, and are unlikely to save the internal combustion machine.
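The scale mismatch the authors point to can be made concrete with one line of arithmetic. The 30 million hectare figure below is an assumed reading of "a few tens of millions of hectares", and the oil palm benchmark of roughly 6,000 L/ha/yr is an order-of-magnitude literature value, not a number from this abstract:

```python
# Back-of-the-envelope check of the claim discussed above.
world_demand_l_per_yr = 5e12   # ~5 trillion litres of liquid fuel per year
claimed_area_ha = 30e6         # assumed: "a few tens of millions of hectares"
palm_oil_l_per_ha_yr = 6e3     # rough literature value for a top oil crop

implied_yield = world_demand_l_per_yr / claimed_area_ha
print(f"implied algal yield: {implied_yield:,.0f} L/ha/yr")
print(f"ratio to oil palm:   {implied_yield / palm_oil_l_per_ha_yr:.0f}x")
```

The implied yield is on the order of 170,000 L/ha/yr, dozens of times that of the best irrigated tropical crops, which is exactly the exaggeration flagged in point (1).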
Abstract:
The Road Rater is a dynamic deflection measuring apparatus for flexible base pavements. Its basic operating principle is to impart a dynamic loading and measure the resultant movement of the pavement with velocity sensors. These data, when properly adjusted for temperature by use of a nomograph included in this report, can be used to determine pavement life expectancy and estimate the overlay thickness required. Road Rater testing will be conducted in the spring, when pavements are in their weakest condition, until seasonal correction factors can be developed. The Road Rater does not have sufficient ram weight to effectively evaluate the load-carrying capacity of rigid pavements: all rigid pavements react similarly to Road Rater testing and generally deflect from 0.65 to 1.30 mils. Research will nevertheless continue on evaluating rigid pavements with the Road Rater. The Road Rater has proven to be a reliable, trouble-free pavement evaluation machine. The deflection apparatus was originally front-mounted but was rear-mounted during the winter of 1977-78; since then, van handling has greatly improved and front suspension parts are no longer overstressed by improper weight distribution. The Road Rater provides a fast, economical, nondestructive test method for evaluating flexible pavements. Road Rater test data can be used to predict pavement life, set priorities for asphaltic concrete resurfacing, and design asphaltic concrete overlays. Temperature and seasonal variations significantly affect Road Rater deflection readings and must be considered. A nomograph included in this report adjusts for temperature but does not correct for seasonal effects. Road Rater testing will be conducted in the spring until seasonal correction factors can be developed. The Road Rater has not successfully evaluated rigid pavements, but research will continue in this area.
Recommendations for continuing Road Rater research, evaluation and application are as follows: 1. A computer program should be established to reduce Road Rater raw data (Range and Sensor readings) to mean deflection (mils) and/or structural rating. This computer printout would be similar to present friction testing printouts and would greatly reduce Road Rater data reduction manpower needs and costs. 2. The seasonal variation study should continue in order to develop seasonal correction factors. Seasonal test roads will be studied concurrently with routine testing during 1979 to develop this relationship. All Road Rater testing will be conducted in the spring until the seasonal relationship is established. 3. An asphaltic concrete overlay design method should be established based on Road Rater deflection readings. The AASHTO Interim Guide for Design of Pavement Structures 1972 will be used as the base document for this study. 4. AASHTO structural numbers should be compared to Road Rater structural ratings during 1979 on asphaltic concrete overlay projects. This analysis will enable Road Rater evaluation of flexible pavements to be refined. Roads will be tested before resurfacing and several months
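Recommendation 1 amounts to a small data-reduction routine. The sketch below is hypothetical: the range multipliers and the four-sensor averaging are invented stand-ins, not the factors used in the report's actual program:

```python
# Hypothetical sketch of the data-reduction program proposed in
# recommendation 1: raw Road Rater output (range setting plus
# velocity-sensor readings) -> mean deflection in mils.
# The range multipliers below are placeholders, not the report's values.

RANGE_FACTOR = {1: 1.0, 2: 2.0, 5: 5.0}  # assumed gain per range setting

def mean_deflection(sensor_readings, range_setting):
    """Scale each raw sensor reading by the range gain and average."""
    gain = RANGE_FACTOR[range_setting]
    deflections = [reading * gain for reading in sensor_readings]
    return sum(deflections) / len(deflections)
```

A structural rating would then be looked up from the temperature-adjusted mean deflection, mirroring the nomograph step described in the abstract.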
Abstract:
Automatic environmental monitoring networks, supported by wireless communication technologies, nowadays provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered particularly in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topographies of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including situations of Föhn and temperature inversion) given measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
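A minimal flavour of this kind of topo-climatic modelling can be given with kernel ridge regression, a close relative of the support vector algorithms mentioned above. The data below are synthetic: a made-up temperature profile with an inversion layer over a one-dimensional "elevation" axis, not the Swiss network measurements:

```python
import numpy as np

def rbf_kernel(a, b, gamma=10.0):
    """Gaussian (RBF) kernel matrix between two point sets."""
    diff = a[:, None, :] - b[None, :, :]
    return np.exp(-gamma * (diff ** 2).sum(axis=-1))

def fit_krr(X, y, gamma=10.0, lam=1e-3):
    """Kernel ridge regression: solve (K + lam*I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: rbf_kernel(Xq, X, gamma) @ alpha

# Synthetic profile: temperature peaks at mid-elevation (an inversion).
elev = np.linspace(0.0, 1.0, 50)[:, None]   # normalized DEM elevation
temp = 15.0 - 20.0 * (elev[:, 0] - 0.4) ** 2
predict_temp = fit_krr(elev, temp)
```

In the article's setting, the inputs would be DEM-derived features (elevation, slope, terrain indices) selected in a data-driven way, and support vector regression or a neural network would play the role of the regressor.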
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications because of their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, firstly through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and secondly through research on process re-engineering in the case of global software support for complex systems. Thirdly, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining their own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related changes in control points and business models. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
This issue review provides an overview of the electronic document management system, or EDMS, project.
Abstract:
This issue review provides an overview of the electronic document management system, or EDMS, project within the judicial branch and courts.
Abstract:
Ventricular assist devices have emerged over the last decade as an effective therapeutic approach to the treatment of end-stage heart failure, particularly in the context of the shortage of organ donors. Nevertheless, despite major technical advances, complication rates remain high and are partly related to the geometric configuration, in particular the site at which the outflow cannula is implanted into the thoracic aorta. Although anastomosis to the descending aorta allows less invasive surgery, the benefits of this technique remain controversial compared with the standard ascending-aorta approach, because of a possibly increased thromboembolic risk and the haemodynamic changes induced in the aortic arch. In this work, we compare the two anastomotic options in silico in terms of flow and pressure. We develop a network of one-dimensional mathematical models and apply it to various clinical situations, for different stages of heart failure and different pump rotation speeds. The initial data are obtained from a 0D model (i.e., one that depends only on time, not on space) of the cardiovascular system including circulatory support, validated against clinical data. The simulations performed show that the two methods are similar, in terms of flow and pressure curves, for all the clinical cases studied. These numerical results support the feasibility of anastomosis to the descending thoracic aorta, which allows less invasive surgery. On a more fundamental level, the cardiovascular system can be simulated with multiple models of differing complexity, at an ever higher computational cost.
We evaluate the advantages of geometric multiscale models (one- and three-dimensional) with patient-derived data, compared with simplified models. The results show that these heterogeneous-dimension models bring an important benefit in terms of computational resources while maintaining acceptable accuracy. In conclusion, these encouraging results demonstrate the relevance of numerical studies in medicine, both at the fundamental level, for understanding pathophysiological mechanisms, and at the applied level, for the development of new therapies.
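As an illustration of what a 0D (time-only) cardiovascular model is, the sketch below integrates a two-element Windkessel model of the systemic circulation driven by a pulsatile cardiac inflow plus a continuous, VAD-like pump flow. All parameter values are generic textbook-scale numbers, not the validated patient-derived model of the thesis:

```python
import math

R = 1.0    # peripheral resistance [mmHg.s/mL], illustrative value
C = 1.5    # arterial compliance [mL/mmHg], illustrative value
T = 0.8    # cardiac period [s]

def inflow(t, pump_flow=60.0):
    """Pulsatile cardiac ejection plus constant VAD-like flow [mL/s]."""
    phase = (t % T) / T
    cardiac = 200.0 * math.sin(math.pi * phase / 0.35) if phase < 0.35 else 0.0
    return cardiac + pump_flow

def simulate(p0=80.0, dt=1e-3, n_beats=10):
    """Forward-Euler integration of C * dP/dt = Q_in(t) - P/R."""
    p, t, trace = p0, 0.0, []
    for _ in range(int(n_beats * T / dt)):
        p += dt * (inflow(t) - p / R) / C
        t += dt
        trace.append(p)
    return trace
```

The thesis couples such lumped-parameter models to 1D arterial networks; the point here is only the structure, in which a lumped compliance and resistance turn a flow source into a pressure waveform.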
Abstract:
The book presents the state of the art in machine learning algorithms (artificial neural networks of different architectures, support vector machines, etc.) as applied to the classification and mapping of spatially distributed environmental data. Basic geostatistical algorithms are presented as well. New trends in machine learning and their application to spatial data are given, and real case studies based on environmental and pollution data are carried out. The book provides a CD-ROM with the Machine Learning Office software, including sample data sets, that will allow both students and researchers to rapidly put the concepts into practice.
Abstract:
Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved efforts to translate statistical epistasis into biological epistasis, and attempts to integrate different omics information sources into epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies require memory proportional to the square of the number of SNPs; a genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT that requires an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1,000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) 2352 processors (2.1 GHz). In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data.
Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rate. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and that can be explained from a biological point of view. This demonstrates the power of our software to find relevant higher-order phenotype-genotype associations.
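The memory argument above is easy to see in code: a permutation-based maxT correction only ever needs the observed statistics and one maximum per permutation, i.e. O(tests + permutations) memory instead of the full tests-by-permutations matrix. The sketch below is a generic illustration of that idea, not the MB-MDR test statistic itself:

```python
def maxt_adjusted_pvalues(stat_fn, n_tests, n_perm=999):
    """maxT multiple-testing correction with O(n_tests + n_perm) memory.

    stat_fn(j, perm) returns the test statistic of hypothesis j under
    permutation index perm; perm=None means the observed labels."""
    observed = [stat_fn(j, None) for j in range(n_tests)]
    perm_maxima = []
    for perm in range(n_perm):
        # keep only the running maximum over all tests for this permutation
        perm_maxima.append(max(stat_fn(j, perm) for j in range(n_tests)))
    # adjusted p-value: fraction of permutation maxima >= observed statistic
    return [(1 + sum(m >= obs for m in perm_maxima)) / (n_perm + 1)
            for obs in observed]
```

With real data, stat_fn would recompute the MB-MDR statistic for SNP pair j under a permutation of the trait; nothing proportional to n_tests times n_perm is ever stored.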
Abstract:
The relationship between inflammation and cancer is well established in several tumor types, including bladder cancer. We performed an association study between 886 inflammatory-gene variants and bladder cancer risk in 1,047 cases and 988 controls from the Spanish Bladder Cancer (SBC)/EPICURO Study. A preliminary exploration with the widely used univariate logistic regression approach did not identify any significant SNP after correcting for multiple testing. We further applied two more comprehensive methods to capture the complexity of bladder cancer genetic susceptibility: Bayesian Threshold LASSO (BTL), a regularized regression method, and AUC-Random Forest (AUC-RF), a machine-learning algorithm. Both approaches explore the joint effect of markers. BTL analysis identified a signature of 37 SNPs in 34 genes associated with bladder cancer. AUC-RF detected an optimal predictive subset of 56 SNPs. Thirteen SNPs were identified by both methods in the total population. Using resources from the Texas Bladder Cancer Study, we were able to replicate 30% of the SNPs assessed. The associations between inflammatory SNPs and bladder cancer were re-examined among non-smokers to eliminate the effect of tobacco, one of the strongest and most prevalent environmental risk factors for this tumor. A nine-SNP signature was detected by BTL. Here we report, for the first time, a set of SNPs in inflammatory genes jointly associated with bladder cancer risk. These results highlight the importance of the complex structure of genetic susceptibility associated with cancer risk.
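The criterion behind AUC-RF is the area under the ROC curve of the model's predictions. As a small self-contained illustration (a generic rank-based AUC via the Mann-Whitney statistic, not the authors' Random Forest pipeline), the AUC of any risk score over case/control labels can be computed as:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic.
    labels are 0/1; tied scores receive their average rank."""
    pairs = sorted(zip(scores, labels))
    ranked = []
    i = 0
    while i < len(pairs):
        j = i
        while j < len(pairs) and pairs[j][0] == pairs[i][0]:
            j += 1
        avg_rank = (i + j + 1) / 2  # average of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranked.append((avg_rank, pairs[k][1]))
        i = j
    n_pos = sum(label for _, label in ranked)
    n_neg = len(ranked) - n_pos
    rank_sum_pos = sum(r for r, label in ranked if label == 1)
    return (rank_sum_pos - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

AUC-RF iteratively discards the least important predictors and keeps the SNP subset that maximizes this quantity, which is how an optimal predictive subset such as the 56 SNPs above is selected.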
Abstract:
At present, cooperatives collect, select, treat and separate fruit according to its calibre (weight; maximum, mean and/or minimum diameter) so that it reaches the final consumer according to its category (calibre). To compete in a market ever more demanding in quality and price, automatic classification systems are required that yield optimal results at high levels of production and productivity. For these tasks there are industrial graders that weigh the fruit with load cells and, using the measured weight, classify the pieces by assigning them to the corresponding outlet (packing table) through a system of electromagnets. Unfortunately, grading fruit by weight alone is not at all reliable, since this process ignores skin thickness, water content, sugar content and other highly relevant factors that considerably influence the final results. The aim of this project is to evolve existing fruit graders by installing an industrial machine-vision system (fast and robust) working in the infrared spectrum (for greater reliability) to provide optimal final results in the classification of fruit and vegetables. In this way, the present project offers the opportunity to improve the performance of the fruit classification line, increasing speed, reducing time losses and human error, and unquestionably improving the final product quality demanded by consumers.
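The weight-based grading step the project starts from reduces to a threshold lookup. A minimal sketch, with invented calibre thresholds since the abstract gives none, is:

```python
# Invented calibre table: (minimum weight in grams, category, outlet).
# Real thresholds depend on the fruit and the market standard.
CALIBRES = [
    (250, "AAA", 1),
    (200, "AA", 2),
    (150, "A", 3),
    (0, "B", 4),
]

def assign_outlet(weight_g):
    """Map a load-cell weight to a size category and packing-table outlet."""
    if weight_g < 0:
        raise ValueError("weight cannot be negative")
    for min_weight, category, outlet in CALIBRES:
        if weight_g >= min_weight:
            return category, outlet
```

The project's point is precisely that this lookup is an unreliable proxy for quality, which is why an infrared machine-vision stage is added ahead of the electromagnet-driven outlet assignment.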
Abstract:
OBJECTIVE: The purpose of this study was to adapt and improve a minimally invasive two-step postmortem angiographic technique for use on human cadavers. Detailed mapping of the entire vascular system is almost impossible with conventional autopsy tools. The technique described should be valuable in the diagnosis of vascular abnormalities. MATERIALS AND METHODS: Postmortem perfusion with an oily liquid is established with a circulation machine. An oily contrast agent is introduced as a bolus injection, and radiographic imaging is performed. In this pilot study, the upper or lower extremities of four human cadavers were perfused. In two cases, the vascular system of a lower extremity was visualized with anterograde perfusion of the arteries. In the other two cases, in which the suspected cause of death was drug intoxication, the veins of an upper extremity were visualized with retrograde perfusion of the venous system. RESULTS: In each case, the vascular system was visualized up to the level of the small supplying and draining vessels. In three of the four cases, vascular abnormalities were found. In one instance, a venous injection mark engendered by the self-administration of drugs was rendered visible by exudation of the contrast agent. In the other two cases, occlusion of the arteries and veins was apparent. CONCLUSION: The method described is readily applicable to human cadavers. After establishment of postmortem perfusion with paraffin oil and injection of the oily contrast agent, the vascular system can be investigated in detail and vascular abnormalities rendered visible.