987 results for automation roadmap
Abstract:
Document supply and interlibrary loan services are a key component of modern libraries. Many new technologies have been decisive in streamlining their processes and reducing response times. E-mail has been one of the main innovations, both for transmitting requests and for keeping users informed. This article analyses the different means of locating documents, from traditional print or CD-ROM catalogues to online access. It also describes the different options for retrieving these documents, especially newer ones such as file transfer and online downloading, as well as value-added services such as the electronic distribution of tables of contents. Finally, it gives a brief description and comparison of the main current suppliers, among them the British Library, INIST, UNCOVER, EBSCODOC, OCLC, KNAW, UMI, ISI, etc.
Abstract:
The aim of this review is to present GenIsisWeb, a program that assists in publishing databases on the Internet. After a brief technical description, and starting from a database previously created with CDS/ISIS, the complete process for publishing it on the Web is shown.
Abstract:
In order to survive, libraries must adapt to user demands in a more flexible, responsive and effective way. New technological developments help the library meet its new needs, which is why switching from one automated system to another has become increasingly common. Implementing a second system is a more complex process, owing to the new features being incorporated and the need to migrate the data already held in the first system. This article analyses the why, how and when of a system change, as well as the preliminary steps to take before starting it: objectives, needs analysis, definition of technical specifications and evaluation. Once the product has been chosen, change management begins, involving both management, who must steer the change, and the staff, who act as agents of change. Finally, the article examines the crucial role that staff participation, training and the communication channels established in the library play in the success of the new system.
Abstract:
In the past decade, a number of trends have come together in the general sphere of computing that have profoundly affected libraries. The popularisation of the Internet, the appearance of open and interoperable systems, the improvements within graphics and multimedia, and the generalised installation of LANs are some of the events of the period. Taken together, the result has been that libraries have undergone an important functional change, representing the switch from simple information depositories to information disseminators. Integrated library management systems have not remained unaffected by this transformation and those that have not adapted to the new technological surroundings are now referred to as legacy systems. The article describes the characteristics of systems existing in today's market and outlines future trends that, according to various authors, include the disappearance of the integrated library management systems that have traditionally been sold.
Abstract:
Soil penetration resistance (PR) and the tensile strength of aggregates (TS) are commonly used to characterize the physical and structural conditions of agricultural soils. This study aimed to assess the functionality of a dynamometry apparatus, whose mobile base was automated for linear speed and position control, for measuring PR and TS. The proposed equipment was used for PR measurement in undisturbed samples of a clayey "Nitossolo Vermelho eutroférrico" (Kandiudalfic Eutrudox) under rubber trees sampled in two positions (within and between rows). These samples were also used to measure the volumetric soil water content and bulk density, and to determine the soil resistance to penetration curve (SRPC). The TS was measured in a sandy loam "Latossolo Vermelho distrófico" (LVd) - Typic Haplustox - and in a very clayey "Nitossolo Vermelho distroférrico" (NVdf) - Typic Paleudalf - under different uses: LVd under "annual crops" and "native forest", NVdf under "annual crops" and "eucalyptus plantation" (> 30 years old). To measure TS, different strain rates were applied using two dynamometry testing devices: a reference machine (0.03 mm s⁻¹), which has been widely used in other studies, and the proposed equipment (1.55 mm s⁻¹). The determination coefficient values of the SRPC were high (R² > 0.9), regardless of the sampling position. Mean TS values in LVd and NVdf obtained with the proposed equipment did not differ (p > 0.05) from those of the reference testing apparatus, regardless of land use and soil type. Results indicate that PR and TS can be measured faster, and with comparable accuracy, by the proposed procedure.
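As a hedged illustration of how a soil resistance to penetration curve (SRPC) can be fitted and its determination coefficient checked, the sketch below assumes a Busscher-type power-law model PR = a·θ^b·ρ^c; the sample values, starting coefficients and the scipy-based fitting routine are illustrative assumptions, not the procedure used in the study.

```python
# Illustrative sketch: fitting a Busscher-type SRPC, PR = a * theta**b * rho**c,
# and reporting R^2. All sample data below are hypothetical, not from the study.
import numpy as np
from scipy.optimize import curve_fit

def srpc(X, a, b, c):
    theta, rho = X                      # volumetric water content, bulk density
    return a * theta**b * rho**c        # penetration resistance (MPa)

# Hypothetical measurements (theta in m3 m-3, rho in Mg m-3, PR in MPa)
theta = np.array([0.22, 0.25, 0.28, 0.31, 0.34, 0.37])
rho   = np.array([1.35, 1.30, 1.32, 1.28, 1.25, 1.22])
pr    = np.array([3.1, 2.6, 2.2, 1.8, 1.5, 1.2])

popt, _ = curve_fit(srpc, (theta, rho), pr, p0=(1.0, -1.0, 1.0), maxfev=10000)
pred = srpc((theta, rho), *popt)
ss_res = np.sum((pr - pred) ** 2)
ss_tot = np.sum((pr - pr.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"fitted a, b, c = {popt}, R^2 = {r2:.3f}")
```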
Abstract:
In this thesis, we study the use of prediction markets for technology assessment. We particularly focus on their ability to assess complex issues, the design constraints required for such applications and their efficacy compared to traditional techniques. To achieve this, we followed a design science research paradigm, iteratively developing, instantiating, evaluating and refining the design of our artifacts. This allowed us to make multiple contributions, both practical and theoretical. We first showed that prediction markets are adequate for properly assessing complex issues. We also developed a typology of design factors and design propositions for using these markets in a technology assessment context. Then, we showed that they are able to solve some issues related to the R&D portfolio management process and we proposed a roadmap for their implementation. Finally, by comparing the instantiation and the results of a multi-criteria decision method and a prediction market, we showed that the latter are more efficient, while offering similar results. We also proposed a framework for comparing forecasting methods, to identify the constraints based on contingency factors. In conclusion, our research opens a new field of application of prediction markets and should help hasten their adoption by enterprises.
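The thesis does not specify the market mechanism here; as a purely illustrative sketch of how a prediction market can aggregate assessments, the snippet below implements Hanson's logarithmic market scoring rule (LMSR), a common automated market maker. The outcome labels, liquidity parameter and trades are hypothetical and not taken from the thesis.

```python
# Illustrative LMSR (logarithmic market scoring rule) market maker.
# Outcomes, liquidity parameter b and trades below are hypothetical examples.
import math

class LMSRMarket:
    def __init__(self, outcomes, b=100.0):
        self.b = b                                # liquidity parameter
        self.q = {o: 0.0 for o in outcomes}       # outstanding shares per outcome

    def _cost(self, q):
        return self.b * math.log(sum(math.exp(v / self.b) for v in q.values()))

    def price(self, outcome):
        denom = sum(math.exp(v / self.b) for v in self.q.values())
        return math.exp(self.q[outcome] / self.b) / denom   # current probability estimate

    def buy(self, outcome, shares):
        before = self._cost(self.q)
        self.q[outcome] += shares
        return self._cost(self.q) - before        # cost charged to the trader

market = LMSRMarket(["tech_adopted", "tech_abandoned"])
print(market.price("tech_adopted"))               # 0.5 before any trades
cost = market.buy("tech_adopted", 30)             # a trader backs adoption
print(round(cost, 2), round(market.price("tech_adopted"), 3))
```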
Abstract:
Integrated approaches using different in vitro methods in combination with bioinformatics can (i) increase the success rate and speed of drug development; (ii) improve the accuracy of toxicological risk assessment; and (iii) increase our understanding of disease. Three-dimensional (3D) cell culture models are important building blocks of this strategy, which has emerged in recent years. The majority of these models are organotypic, i.e., they aim to reproduce major functions of an organ or organ system. This implies in many cases that more than one cell type forms the 3D structure, and often matrix elements play an important role. This review summarizes the state of the art concerning commonalities of the different models. For instance, the theory of mass transport/metabolite exchange in 3D systems and the special analytical requirements for test endpoints in organotypic cultures are discussed in detail. In the next part, 3D model systems for selected organs (liver, lung, skin, brain) are presented and characterized in dedicated chapters. Also, 3D approaches to the modeling of tumors are presented and discussed. All chapters give a historical background, illustrate the large variety of approaches, and highlight strengths and weaknesses as well as specific requirements. Moreover, they refer to the application in disease modeling, drug discovery and safety assessment. Finally, consensus recommendations indicate a roadmap for the successful implementation of 3D models in routine screening. It is expected that the use of such models will accelerate progress by reducing error rates and wrong predictions from compound testing.
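As a hedged illustration of the mass-transport theory mentioned above, one standard reaction-diffusion formulation (an assumption for illustration, not necessarily the formulation used in the review) balances diffusion of a metabolite of concentration c, with diffusion coefficient D, against Michaelis-Menten uptake by the cells with maximal rate V_max and half-saturation constant K_m:

```latex
% Illustrative reaction-diffusion balance for metabolite exchange in a 3D culture
\frac{\partial c}{\partial t} \;=\; D\,\nabla^{2} c \;-\; \frac{V_{\max}\, c}{K_m + c}
```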
Abstract:
The aim of this study was to evaluate the forensic protocol recently developed by Qiagen for the QIAsymphony automated DNA extraction platform. Samples containing low amounts of DNA were specifically considered, since they represent the majority of samples processed in our laboratory. The analysis of simulated blood and saliva traces showed that the highest DNA yields were obtained with the maximal elution volume available for the forensic protocol, that is, 200 µl. The resulting DNA extracts were too diluted for successful DNA profiling and required a concentration step. This additional step is time consuming and potentially increases the risks of sample inversion and contamination. The 200 µl DNA extracts were concentrated to 25 µl, and the DNA recovery was estimated with real-time PCR as well as with the percentage of SGM Plus alleles detected. Results using our manual protocol, based on the QIAamp DNA mini kit, and the automated protocol were comparable. Further tests will be conducted to determine more precisely DNA recovery, contamination risk and PCR inhibitor removal, once a definitive procedure allowing the concentration of DNA extracts from low-yield samples becomes available for the QIAsymphony.
Abstract:
Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are, therefore, of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man oriented), FMEA (system oriented), or HAZOP (process oriented), is not satisfactory. The use of a dynamic modeling approach in order to allow multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, realized on an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes in an OH&S perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced, in an explicit way, through machine interconnections and flow properties. While man-machine interactions are modeled as triggering events for the state spaces of the machines, the CREAM cognitive behavior model is used in order to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flow and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework. The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. But, most of all, it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
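As a hedged, toy illustration of the modelling idea described above (machines as interconnected state spaces whose transitions fire only when constraints on the flowing entities hold), the sketch below uses a minimal token-based, Petri-net-style structure in Python. It is not the CO-OPN formalism or the MORM model itself, and all place names, constraints and token attributes are invented for the example.

```python
# Toy Petri-net-style sketch: two "machines" as places, a transition that fires
# only when a constraint on the token's measure (e.g. part temperature) holds.
# Names, constraints and values are invented for illustration only.

class Place:
    def __init__(self, name):
        self.name = name
        self.tokens = []            # each token is a dict of measures

class Transition:
    def __init__(self, src, dst, guard):
        self.src, self.dst, self.guard = src, dst, guard

    def fire(self):
        for tok in list(self.src.tokens):
            if self.guard(tok):                     # constraint on the measure
                self.src.tokens.remove(tok)
                self.dst.tokens.append(tok)
                return True
        return False

press = Place("press")
oven = Place("oven")
press.tokens.append({"temp_C": 45, "operator_action": "load"})

# A man-machine interaction modelled as a triggering event plus a flow constraint.
to_oven = Transition(press, oven,
                     guard=lambda t: t["operator_action"] == "load" and t["temp_C"] < 60)

print(to_oven.fire(), [t["temp_C"] for t in oven.tokens])
```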
Abstract:
Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. By the application of significant down-force, and the use of an appropriate cutting edge angle, compacted snow and ice can be removed very effectively by such plows, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied to the plow, then it will not cut the ice or compacted snow. However, if too much force is applied, then either the cutting edge may gouge the road surface, causing significant damage often to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator. Spinning of the truck in such situations is easily accomplished. Further, excessive down force will result in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. In order to successfully automate the operation of an underbody plow, a control system must be developed that follows a set of rules that represent appropriate operation of such a plow. These rules have been developed, based upon earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. These rules have been successfully coded into two different computer programs, both using the MatLab® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation. This program is essentially deterministic in nature. In the second program, the Simulink® package in the MatLab® software system was used to implement these rules using fuzzy logic. Fuzzy logic essentially replaces a fixed and constant rule with one that varies in such a way as to improve operational control. The development of the fuzzy logic in this simulation was achieved simply by using appropriate routines in the computer software, rather than being developed directly. The results of the computer testing and simulation indicate that a fully automated, computer controlled underbody plow is indeed possible. The issue of whether the next steps toward full automation should be taken (and by whom) has also been considered, and the possibility of some sort of joint venture between a Department of Transportation and a vendor has been suggested.
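As an illustrative sketch of the deterministic rule-checking idea described above (the first of the two programs), the snippet below flags load and blade-angle readings that violate simple operating rules. The threshold values and variable names are hypothetical, and the original programs were written in MatLab rather than Python.

```python
# Illustrative rule checker for an underbody plow: flags too little or too much
# down-force and out-of-range cutting-edge angles. Thresholds are hypothetical.

MIN_DOWN_FORCE_KN = 20.0    # below this the blade will not cut compacted snow/ice
MAX_DOWN_FORCE_KN = 80.0    # above this the edge may gouge the road or ride up
MIN_ANGLE_DEG = 5.0
MAX_ANGLE_DEG = 25.0

def check_reading(down_force_kn, angle_deg):
    """Return a list of rule violations for one sensor reading."""
    violations = []
    if down_force_kn < MIN_DOWN_FORCE_KN:
        violations.append("insufficient down-force: blade will not cut")
    elif down_force_kn > MAX_DOWN_FORCE_KN:
        violations.append("excessive down-force: risk of gouging or riding up")
    if not MIN_ANGLE_DEG <= angle_deg <= MAX_ANGLE_DEG:
        violations.append("cutting-edge angle out of operating range")
    return violations

for force, angle in [(15.0, 12.0), (50.0, 12.0), (95.0, 30.0)]:
    print(force, angle, check_reading(force, angle) or ["ok"])
```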
Abstract:
Many would argue that the dramatic rise in autism has reached critical mass, and this council echoes that statement. Iowa, like many states in the nation, is currently ill equipped to handle the large influx of children and adults with autism. When this council was initially formed we were facing diagnosis rates of 1 in 150, and currently the diagnosis rate is 1 in 91. Current resource strains in education, qualified trained professionals, access to care, and financial services are rapidly deteriorating Iowa’s ability to deliver quality services to children, adults, and families affected by autism. If Iowa leadership fails to act quickly, the already strained system will face a breaking point in the following areas: financing, coordination of care, educational resources, early identification, adult services, and access to service delivery, just to name a few. This council has spent the past 12-plus months hearing testimony from state officials, providers, and caregivers to ensure that care for those with autism is effective, cost efficient, and accessible. This council will be making recommendations on three major areas: early identification, seamless support/coordination of care, and financing of care. While these areas will be highlighted in this first annual report, this in no way minimizes other areas that need to be addressed, such as early intervention, special education, training, in-home support services, financing options, and data collection. Implementing the initial recommendations of this council will lay foundational support for the areas mentioned above. Often those in a position to help ask what can be done to help families in Iowa. This council has provided a roadmap to help facilitate effective and proven treatments for children and adults with autism.
Abstract:
The goal of this study was to compare the quantity and purity of DNA extracted from biological traces using the QIAsymphony robot with that of the manual QIAamp DNA mini kit currently in use in our laboratory. We found that the DNA yield of the robot was 1.6-3.5 times lower than that of the manual protocol. This resulted in a loss of 8% and 29% of the alleles correctly scored when analyzing 1/400 and 1/800 diluted saliva samples, respectively. Specific tests showed that the QIAsymphony was at least 2-16 times more efficient at removing PCR inhibitors. The higher purity of the DNA may therefore partly compensate for the lower DNA yield obtained. No case of cross-contamination was observed among samples. After purification with the robot, DNA extracts can be automatically transferred to 96-well plates, which is an ideal format for subsequent RT-qPCR quantification and DNA amplification. Less hands-on time and reduced risk of operational errors represent additional advantages of the robotic platform.
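As a hedged arithmetic sketch of the two comparison measures used above (yield ratio between methods and percentage of alleles correctly scored), the snippet below uses invented concentrations and allele counts; it is not the laboratory's actual quantification workflow.

```python
# Illustrative comparison of two DNA extraction methods: yield ratio from
# qPCR concentrations and percentage of expected alleles detected.
# All numbers below are invented for the example.

def yield_ratio(manual_ng_per_ul, robot_ng_per_ul):
    return manual_ng_per_ul / robot_ng_per_ul

def percent_alleles_scored(detected, expected):
    return 100.0 * detected / expected

manual_conc, robot_conc = 0.70, 0.28          # ng/µl from real-time PCR
detected, expected = 29, 32                    # alleles in an SGM Plus profile

print(f"manual/robot yield ratio: {yield_ratio(manual_conc, robot_conc):.1f}x")
print(f"alleles correctly scored: {percent_alleles_scored(detected, expected):.0f}%")
```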
Abstract:
The present research deals with an important public health threat, which is the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data but did not prove to be the sole determinant for the spatial modeling. The statistical analysis of data, both at univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving windows methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing methods of declustering were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-to-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top method complexity approach was adopted and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of indoor radon pollution data into consideration. The categorization transform was presented as a solution for extreme value modeling through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. The sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for data classification hardening. Within the classification methods, probabilistic neural networks (PNN) proved better adapted for the modeling of high-threshold categorization and for automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions.
In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to support efficient indoor radon decision making.
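As a hedged sketch of one of the exploratory tools named above, K Nearest Neighbors interpolation, the snippet below estimates an indoor radon value at an unsampled location as the inverse-distance-weighted mean of its k nearest measurements. The coordinates, radon values and weighting choice are illustrative assumptions, not the thesis implementation.

```python
# Illustrative KNN spatial interpolation of indoor radon concentrations.
# Coordinates (km) and radon values (Bq/m3) are invented for the example.
import numpy as np

def knn_interpolate(xy_known, values, xy_query, k=3, eps=1e-9):
    """Inverse-distance-weighted mean of the k nearest observations."""
    d = np.linalg.norm(xy_known - xy_query, axis=1)
    nearest = np.argsort(d)[:k]
    w = 1.0 / (d[nearest] + eps)                 # closer points weigh more
    return float(np.sum(w * values[nearest]) / np.sum(w))

xy = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0], [0.5, 2.0], [1.5, 1.8]])
radon = np.array([120.0, 300.0, 90.0, 450.0, 210.0])

print(knn_interpolate(xy, radon, np.array([1.0, 1.0]), k=3))
```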
Abstract:
Aesthetic Group Gymnastics (AGG) is an emerging sport for which almost no field work or publications exist. In the scoring code of this discipline, jumping ability, unity of body movement and synchronization among the members of the group carry considerable weight in the technical value and execution scores. In this study, the expressions of explosive, elastic and reactive strength of a high-level aesthetic group gymnastics team were measured, evaluated and compared at the beginning and at the end of the competitive period, using the Bosco battery of vertical jump tests, specifically SJ, CMJ, CMJas and RJ (15" CMJas). The study also analysed the inter-group temporal synchronization and coordination in the execution of the technical jump difficulties of the competitive choreographies over the competitive period of a high-level aesthetic gymnastics group, considering synchronization at the start and at the end of each difficulty. The results show that elastic-explosive strength in the CMJ decreased by 0.46% and explosive strength in the SJ (without reuse of elastic energy or exploitation of the myotatic reflex) increased by 4.63%. Over the competitive period of the senior aesthetic gymnastics group of Club Muntanyenc Sant Cugat, the contribution of the arms to jumping ability increased by 1.32% and alactic anaerobic power by 4.76%. Although the results were positive in most tests, the sample is not considered to have achieved a significant improvement, since it did not exceed the 10% threshold proposed at the start of the study and the values obtained were highly unstable. Within the same test, the percentages of losses and gains varied widely, so no relationship between training and improved jumping ability can be established. Inter-group temporal synchronization improved by between 37.50% (start-time synchronization) and 50.00% (end-time synchronization) for the technical difficulties, which is directly related to the automation of execution mechanisms over the competitive season. Even so, it did not equal or exceed the 70% improvement proposed in the initial hypotheses of the study.
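As a hedged arithmetic sketch of how jump performance and its pre/post change can be quantified in Bosco-type tests, the snippet below derives jump height from flight time with the standard h = g·t²/8 relation and computes the percentage change between two test sessions. The flight times are invented and this is not the study's measurement protocol.

```python
# Illustrative Bosco-test arithmetic: jump height from flight time (h = g*t^2/8)
# and percentage change between pre- and post-period tests. Times are invented.
G = 9.81  # m/s^2

def jump_height_cm(flight_time_s):
    return 100.0 * G * flight_time_s**2 / 8.0

def percent_change(pre, post):
    return 100.0 * (post - pre) / pre

cmj_pre, cmj_post = jump_height_cm(0.520), jump_height_cm(0.519)   # flight times in s
sj_pre, sj_post = jump_height_cm(0.470), jump_height_cm(0.481)

print(f"CMJ change: {percent_change(cmj_pre, cmj_post):+.2f}%")
print(f"SJ change:  {percent_change(sj_pre, sj_post):+.2f}%")
```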