925 results for Integrated library system
Abstract:
A dominant account of perseverative errors in early development contends that such errors reflect a failure to inhibit a prepotent response. This study investigated whether perseveration might also arise from a failure to inhibit a prepotent representation. Children watched as a toy was hidden at an A location, waited during a delay, and then watched the experimenter find the toy. After six observation-only A trials, the toy was hidden at a B location, and children were allowed to search for the toy. Two- and 4-year-olds’ responses on the B trials were significantly biased toward A even though they had never overtly responded to this location. Thus, perseverative biases in early development can arise as a result of prepotent representations, demonstrating that the prepotent-response account is incomplete. We discuss three alternative interpretations of these results, including the possibility that representational and response-based biases reflect the operation of a single, integrated behavioral system.
Abstract:
Each plasma physics laboratory has its own proprietary control and data acquisition system, and these systems usually differ from one laboratory to another: each laboratory has its own way of controlling the experiment and retrieving data from its database. Fusion research relies to a great extent on international collaboration, and such laboratory-specific systems make it difficult to follow the work remotely. The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. MDSplus was chosen because it is widely used, so scientists from different institutions can work with the same system on different experiments and different tokamaks without needing to know how each machine handles its data acquisition and analysis. Another important point is that MDSplus provides a library system that allows communication between different programming languages (Java, Fortran, C, C++, Python) and programs such as MATLAB, IDL and OCTAVE. In the case of the TCABR tokamak, interfaces (the object of this paper) were developed between the system already in use and MDSplus, instead of adopting MDSplus at all stages, from control and data acquisition to data analysis. This was done to preserve a complex system already in operation, which would otherwise have taken a long time to migrate. The implementation also allows new components to be added that use MDSplus fully at all stages. (c) 2012 Elsevier B.V. All rights reserved.
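As a rough illustration of the kind of cross-language access MDSplus enables, the following Python sketch reads a signal from a remote MDSplus server through the thin-client Connection interface. The server address, tree name, shot number and node path are hypothetical placeholders, not actual TCABR identifiers.

```python
# Minimal sketch of remote MDSplus access from Python (thin client).
# Server, tree name, shot number and node path below are hypothetical.
from MDSplus import Connection

conn = Connection('mdsplus.example.org')   # hypothetical data server
conn.openTree('tcabr', 12345)              # hypothetical tree name and shot number

# Fetch a signal and its timebase; dim_of() returns the time axis of the node.
current = conn.get('\\TOP.SIGNALS:PLASMA_CURRENT')          # hypothetical node path
time = conn.get('dim_of(\\TOP.SIGNALS:PLASMA_CURRENT)')

print(current.data().shape, time.data().shape)
```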
Abstract:
The Management Model of the Integrated Library System of the University of São Paulo (SIBi/USP) incorporates management concepts and tools common to modern organizations in order to ensure better systemic performance indices and to provide users with high-quality, efficient services. Part of the Model is the identification and detailed description of the System's work processes, as well as the establishment of a set of performance indicators. Building on the initial work, a complementary survey was carried out to identify processes covering the System as a whole. The data were organized in spreadsheets for better visualization, particularly with respect to macro-processes, processes, sub-processes and activities. The processes were classified as core, managerial and support processes, and the activities pertaining to each of them were listed, together with the corresponding technical instructions and workflows. A number of indicators were established, taking as a reference the IFLA indicators already studied by another Working Group. From that study, four indicators were tested and validated by SIBi/USP through a pilot application in some of the System's Libraries, and others were defined in the current study. This made it possible to map the processes and activities carried out by the set of Libraries, with each Library able to adapt its own mapping according to its specialty and specificity. The definition of a basic core of indicators aims to enable the fulfillment of the mission and objectives in line with SIBi/USP policy.
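As a purely illustrative sketch of how a core indicator of this kind might be computed from spreadsheet-style data, the snippet below calculates a loans-per-registered-user ratio; the library names and figures are hypothetical and are not taken from the SIBi/USP study.

```python
# Illustrative computation of a simple library performance indicator
# (loans per registered user). All names and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class LibraryYear:
    library: str
    loans: int
    registered_users: int

    def loans_per_capita(self) -> float:
        # A basic ratio indicator: usage normalized by the served population.
        return self.loans / self.registered_users

records = [
    LibraryYear("Library A", loans=48_200, registered_users=6_100),
    LibraryYear("Library B", loans=12_750, registered_users=2_300),
]

for r in records:
    print(f"{r.library}: {r.loans_per_capita():.1f} loans per registered user")
```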
Abstract:
To compare the 1-year cost-effectiveness of therapeutic assertive community treatment (ACT) with standard care in schizophrenia. ACT was specifically developed for patients with schizophrenia, delivered by psychosis experts highly trained in respective psychotherapies, and embedded into an integrated care system.
Abstract:
OBJECTIVES We sought to evaluate the strategy success and short-term clinical outcomes of direct stenting via 5 French (F) diagnostic catheters using a novel bare metal stent with an integrated delivery system (IDS) (Svelte Medical Systems, New Providence, NJ) and compare the results to a conventionally treated matched group. METHODS Fifteen consecutive patients with lesions deemed suitable for direct stenting using a bare metal stent were included. The primary endpoint was the strategy success, defined as the ability to successfully treat a target lesion via a 5 F diagnostic catheter with a good angiographic result (TIMI III flow, residual stenosis ≤20%). Procedure and fluoroscopy times, contrast agent use, cost, and short-term clinical outcomes were compared to a matched group treated via conventional stenting. RESULTS The primary endpoint was reached in 14/15 patients (93%). There were no significant differences in procedural times (58.6 min ± 12.7 vs. 57.4 min ± 14.2), fluoroscopy times (10.0 min ± 4.3 vs. 10.1 min ± 3.9) or contrast agent use (193.7 ml ± 54.8 vs. 181.4 ml ± 35.6). However, there were significant reductions in materials used in the study group compared to the control group, equating to cost savings of almost US $600 per case (US $212.44 ± 258.09 vs. US $804.69 ± 468.11; P = 0.001). CONCLUSIONS Direct stenting using a novel bare metal stent with an IDS via 5 F diagnostic catheters is a viable alternative to conventional stenting in selected patients and is associated with significant cost savings.
Abstract:
Introduction and objective. A number of prognostic factors have been reported for predicting survival in patients with renal cell carcinoma, yet few studies have analyzed the effects of those factors at different stages of the disease process. In this study, different stages of disease progression were evaluated: from nephrectomy to metastasis, from metastasis to death, and from evaluation to death. Methods. In this retrospective follow-up study, records of 97 deceased renal cell carcinoma (RCC) patients were reviewed between September 2006 and October 2006. Patients with TNM Stage IV disease before nephrectomy or with cancer diagnoses other than RCC were excluded, leaving 64 records for analysis. TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology and patient gender were analyzed in relation to time to metastasis. Time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology and patient gender were tested for significance in relation to time from metastasis to death. Finally, laboratory values at the time of evaluation, Eastern Cooperative Oncology Group (ECOG) performance status, UCLA Integrated Staging System (UISS), time from nephrectomy to metastasis, TNM staging, Fuhrman grade, age, tumor size, tumor volume, histology and patient gender were tested for significance in relation to time from evaluation to death. Linear regression and Cox proportional hazards models (univariate and multivariate) were used to test significance. The Kaplan-Meier log-rank test was used to detect significant differences between groups at various endpoints. Results. Compared with patients with negative lymph nodes at the time of nephrectomy, those with a single positive lymph node had a significantly shorter time to metastasis (p<0.0001). Compared to other histological types, clear cell histology was associated with significantly longer metastasis-free survival (p=0.003). Clear cell histology compared to other types (p=0.0002 univariate, p=0.038 multivariate) and log-transformed time to metastasis (p=0.028) significantly affected time from metastasis to death. Metastasis-free intervals greater than one year and greater than two years, compared with metastasis within one and two years, were associated with a statistically significant survival benefit (p=0.004 and p=0.0318). Time from evaluation to death was affected by a metastasis-free interval greater than one year (p=0.0459), alcohol consumption (p=0.044), LDH (p=0.006), ECOG performance status (p<0.001), and hemoglobin level (p=0.0092). The UISS stratified the patient population's survival risk in a statistically significant manner (p=0.001). No other factors were found to be significant. Conclusion. Clear cell histology is predictive of both time to metastasis and time from metastasis to death. Nodal status at the time of nephrectomy may predict risk of metastasis. The time interval to metastasis significantly predicts time from metastasis to death and time from evaluation to death. ECOG performance status and hemoglobin level predict survival outcome at evaluation. Finally, the UISS appropriately stratifies risk in our population.
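For readers unfamiliar with the statistical toolkit described above, the following Python sketch shows the general shape of such an analysis using the lifelines library: a Kaplan-Meier estimate, a log-rank comparison between two groups, and a univariate Cox proportional hazards fit. The DataFrame columns and values are hypothetical and do not reproduce the study's data.

```python
# Sketch of the survival-analysis workflow described in the abstract,
# using the lifelines library. All data and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months_to_death": [10, 24, 36, 7, 50, 18, 29, 44],
    "observed_death":  [1, 1, 0, 1, 0, 1, 1, 0],   # 1 = death observed, 0 = censored
    "clear_cell":      [1, 1, 0, 0, 1, 0, 1, 1],   # histology indicator
})

# Kaplan-Meier estimate of overall survival.
kmf = KaplanMeierFitter()
kmf.fit(df["months_to_death"], event_observed=df["observed_death"])

# Log-rank test comparing clear-cell vs. other histologies.
cc, other = df[df["clear_cell"] == 1], df[df["clear_cell"] == 0]
result = logrank_test(cc["months_to_death"], other["months_to_death"],
                      event_observed_A=cc["observed_death"],
                      event_observed_B=other["observed_death"])

# Univariate Cox proportional hazards model for histology.
cph = CoxPHFitter()
cph.fit(df[["months_to_death", "observed_death", "clear_cell"]],
        duration_col="months_to_death", event_col="observed_death")

print(result.p_value, cph.hazard_ratios_)
```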
Abstract:
An integrated anaerobic-aerobic treatment system for sulphate-laden wastewater was proposed to achieve low sludge production, low energy consumption and effective sulphide control. Before integrating the whole system, the feasibility of autotrophic denitrification utilising the dissolved sulphide produced during anaerobic treatment of sulphate-rich wastewater was studied. An upflow anaerobic sludge blanket reactor was operated to treat sulphate-rich synthetic wastewater (TOC = 100 mg/L and sulphate = 500 mg/L), and its effluent, containing dissolved sulphide, was fed together with an external nitrate solution into an anoxic biofilter. The anaerobic reactor was able to remove 77-85% of TOC at an HRT of 3 h and produce 70-90 mg S/L of sulphide in dissolved form for the subsequent denitrification. The performance of the anoxic reactor was stable, and it could remove 30 mg N/L of nitrate at an HRT of 2 h through autotrophic denitrification. Furthermore, the sulphur balance for the anoxic filter showed that more than 90% of the removed sulphide was actually oxidised to sulphate, so there was no accumulation of sulphur particles in the filter bed. The net sludge productions were approximately 0.15 to 0.18 g VSS/g COD in the anaerobic reactor and 0.22 to 0.31 g VSS/g NO3--N in the anoxic reactor. The findings of this study will be helpful in developing the integrated treatment system to achieve low-cost excess sludge minimisation.
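As a simple illustration of the sulphur balance reported for the anoxic filter, the snippet below checks what fraction of the removed sulphide reappears as sulphate; the influent and effluent concentrations used are hypothetical round numbers, not measurements from the study.

```python
# Illustrative sulphur balance across an anoxic filter.
# Concentrations are hypothetical, expressed as mg S/L.
sulphide_in, sulphide_out = 80.0, 5.0     # dissolved sulphide entering/leaving the filter
sulphate_in, sulphate_out = 10.0, 80.0    # sulphate entering/leaving the filter

sulphide_removed = sulphide_in - sulphide_out
sulphate_produced = sulphate_out - sulphate_in

# Fraction of removed sulphide recovered as sulphate; values near 1.0 suggest
# little elemental sulphur accumulates in the filter bed.
recovery = sulphate_produced / sulphide_removed
print(f"{recovery:.0%} of removed sulphide recovered as sulphate")
```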
Abstract:
Many manufacturing companies have long endured the problems associated with the presence of `islands of automation'. Due to rapid computerisation, `islands' such as Computer-Aided Design (CAD), Computer-Aided Manufacturing (CAM), Flexible Manufacturing Systems (FMS) and Material Requirement Planning (MRP) have emerged and, lacking co-ordination, often lead to inefficient performance of the overall system. The main objective of Computer-Integrated Manufacturing (CIM) technology is to form a cohesive network between these islands. Unfortunately, a commonly used approach, the centralised system approach, has imposed major technical constraints and design complications on development strategies. As a consequence, small companies have experienced difficulties in participating in CIM technology. The research described in this thesis has aimed to examine alternative approaches to CIM system design. Through research and experimentation, the cellular system approach, which has long existed in the form of manufacturing layouts, has been found to reduce the complexity of an integrated manufacturing system, leading to better control and far higher system flexibility. Based on the cellular principle, some central management functions have also been distributed to smaller cells within the system. This concept is known as distributed planning and control. Through the development of an embryo cellular CIM system, the influence of both the cellular principle and the distribution methodology has been evaluated. Based on the evidence obtained, it has been concluded that the distributed planning and control methodology can greatly enhance cellular features within an integrated system. Both the cellular system approach and the distributed control concept will therefore make significant contributions to the design of future CIM systems, particularly systems designed with respect to small company requirements.
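The abstract does not specify an implementation, but the idea of distributing planning and control to cells can be pictured roughly as below: each cell plans its own work queue locally, while a thin coordinator only routes orders rather than scheduling centrally. The class and method names are hypothetical illustrations of the concept, not the thesis's system.

```python
# Rough sketch of distributed planning and control among manufacturing cells.
# Each cell schedules its own queue; the coordinator only routes work.
# All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Cell:
    name: str
    capabilities: set
    queue: list = field(default_factory=list)

    def can_make(self, part: str) -> bool:
        return part in self.capabilities

    def plan_locally(self) -> list:
        # Local planning decision, e.g. shortest-processing-time ordering.
        return sorted(self.queue, key=lambda job: job[1])

class Coordinator:
    """Routes orders to capable cells but does not schedule them centrally."""
    def __init__(self, cells):
        self.cells = cells

    def dispatch(self, part: str, minutes: int) -> str:
        for cell in self.cells:
            if cell.can_make(part):
                cell.queue.append((part, minutes))
                return cell.name
        raise ValueError(f"no cell can make {part}")

cells = [Cell("turning", {"shaft"}), Cell("milling", {"bracket", "housing"})]
coord = Coordinator(cells)
coord.dispatch("bracket", 30)
coord.dispatch("housing", 15)
print(cells[1].plan_locally())   # the milling cell orders its own work
```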
Abstract:
The current state of Russian databases for the properties of substances and materials was reviewed. A brief review of methods for integrating such information systems was prepared, and an approach to integrating distributed databases based on a metabase was proposed. Implementation details of applying the proposed integration approach to the database on electronics materials were discussed. An operating pilot version of the resulting integrated information system, implemented at IMET RAS, was described.
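A metabase-driven integration layer can be pictured, very roughly, as a registry describing which properties each member database holds and how to query it; a federating service consults the registry and fans the query out. The sketch below is a hypothetical illustration of that pattern, not the IMET RAS implementation.

```python
# Hypothetical sketch of metabase-based integration of distributed property
# databases: a registry (the "metabase") maps each data source to the
# properties it holds, and a federator fans queries out to matching sources.

# Per-source metadata plus a query function (stubbed with lambdas here).
METABASE: dict[str, dict] = {
    "electronics_materials_db": {
        "properties": {"band_gap", "resistivity"},
        "query": lambda material, prop: {"source": "electronics_materials_db",
                                         "material": material, "property": prop},
    },
    "alloys_db": {
        "properties": {"tensile_strength"},
        "query": lambda material, prop: {"source": "alloys_db",
                                         "material": material, "property": prop},
    },
}

def federated_query(material: str, prop: str) -> list[dict]:
    """Ask every registered source that advertises the requested property."""
    return [entry["query"](material, prop)
            for entry in METABASE.values()
            if prop in entry["properties"]]

print(federated_query("GaAs", "band_gap"))
```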
Abstract:
To achieve academic success, children with learning-related disabilities often receive special education supports at school. Currently, Canada does not have a federal department or integrated national system of education. Instead, each province and territory has a separate department or ministry that is responsible for the organization and delivery of education, including special education, at the elementary level. At the macro (national) level, inclusive education is the policy across Canada. However, each province and territory has its own legislation, definitions, and policies mandating special education services. These variations result in little consistency at the micro (individual school) level. Differences between eligibility requirements, supports offered, and delivery methods may present challenges for highly mobile families who must navigate new special education systems on behalf of their children with medical or learning challenges. One of the defining features of the Canadian military lifestyle is geographic mobility. As a result, many families are tasked with navigating new school systems for their children, a task that may be more difficult when children require special education services. The purpose of this study is to explore the impact of geographic mobility on Canadian military families and their children's access to special education services. The secondary objective was to gain insight into supports that helped facilitate access to services, as well as supports that participants believe would have helped facilitate access. A qualitative approach, interpretive phenomenological analysis (IPA), was employed due to its focus on individuals' experiences and their understandings of a particular phenomenon. IPA allowed participants to reflect on the significance of their experiences, while the researcher engaged with these reflections to make sense of the meanings associated with their experiences. Nine semi-structured interviews were conducted with civilian caregivers who have a child with special education needs. An interview guide and probes were used to elicit rich, detailed, first-person accounts of their experiences navigating new special education systems. The main themes that emerged from the participants' combined experiences addressed the emotional components of experiencing a transition, factors that may facilitate access to special education services, and career implications associated with accessing and maintaining special education services. Findings from the study illustrate that Canadian families experience many, and oftentimes severe, barriers to accessing special education services after a posting. Furthermore, the impacts reported throughout the study echo the existing American literature on geographic mobility and access to special education services. Building on the literature, this study also highlights the need for further research exploring factors that create unique barriers to access in a Canadian context, resulting from the current special education climate, military policies, and military family support services.
Abstract:
This work aims to analyze information technology (IT) risks in data migration procedures, considering the migration from ALEPH, an Integrated Library System (ILS), to the Library Module of the software Sistema Integrado de Gestão de Atividades Acadêmicas (SIGAA) at the Zila Mamede Central Library of the Federal University of Rio Grande do Norte (UFRN) in Natal, Brazil. The methodological procedure was qualitative exploratory research, with a case study conducted at the library in order to better understand this phenomenon. Data were collected through semi-structured interviews with eleven (11) subjects employed at the library and at the Technology Superintendence at UFRN. Content analysis with a thematic review process was used to examine the data. After data migration, the interview results were linked to the analysis units and their system records by category correspondence. The main risks detected were data destruction, data loss, database communication failure, delayed user response, data inconsistency and duplication. These risks have implications and generate disruptions that affect external and internal system users, leading to stress, duplicated work and inconvenience. Some risk management measures were therefore adopted, such as adequate planning, central management support, and pilot test simulations. These measures reduced risk, the occurrence of problems and possible unforeseen costs, and helped achieve organizational objectives, among other advantages. It can therefore be inferred that risks in database conversion in libraries do exist and that some are predictable; however, librarians often do not know about them, or ignore them, and are not very concerned with identifying risks in database conversion, although acknowledging them would minimize or even eliminate them. Another important aspect is the scarcity of empirical research dealing specifically with this subject, which points to the need for new approaches to promote a better understanding of the matter in the corporate environment of information units.
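The risks listed above (data loss, inconsistency, duplication) are typically mitigated with post-migration validation checks. The sketch below shows one generic way to compare source and target record sets after a migration; the record structure and field names are hypothetical and unrelated to the ALEPH or SIGAA schemas.

```python
# Generic post-migration validation sketch: detect lost, duplicated and
# altered records by comparing source and target extracts.
# Field names and data are hypothetical, not ALEPH/SIGAA schemas.
from collections import Counter

source = [{"id": "b1", "title": "Cataloging basics"},
          {"id": "b2", "title": "Metadata handbook"},
          {"id": "b3", "title": "Preservation guide"}]
target = [{"id": "b1", "title": "Cataloging basics"},
          {"id": "b2", "title": "Metadata handbok"},   # altered during migration
          {"id": "b2", "title": "Metadata handbok"}]   # duplicated; b3 was lost

src_ids, tgt_ids = Counter(r["id"] for r in source), Counter(r["id"] for r in target)

lost = [i for i in src_ids if i not in tgt_ids]
duplicated = [i for i, n in tgt_ids.items() if n > 1]
altered = [r["id"] for r in source
           if any(t["id"] == r["id"] and t["title"] != r["title"] for t in target)]

print("lost:", lost, "duplicated:", duplicated, "altered:", altered)
```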
Abstract:
Decision-making in university libraries is extremely important; however, it faces complications such as the large number of data sources and the large volumes of data to be analyzed. University libraries are used to producing and collecting a great deal of information about their data and services. Common data sources are the output of internal systems, online portals and catalogues, quality assessments and surveys. Unfortunately, these data sources are only partially used for decision-making, owing to the wide variety of formats and standards, as well as the lack of efficient methods and integration tools. This thesis project presents the analysis, design and implementation of a Data Warehouse, an integrated decision-making system for the Centro de Documentación Juan Bautista Vázquez. First, the requirements and the data analysis are presented on the basis of a methodology; this methodology incorporates key elements that influence a library decision, including process analysis, estimated quality, relevant information and user interaction. Next, the architecture and design of the Data Warehouse are proposed, together with their implementation, which supports data integration, processing and storage. Finally, the stored data are analyzed through analytical processing tools and the application of Bibliomining techniques, helping the documentation centre's managers make optimal decisions about their resources and services.
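A library data warehouse of this kind is typically organized around a star schema, with a fact table for usage events (loans, downloads) and dimension tables for users, items and dates. The sketch below loads a tiny hypothetical loans fact table with pandas; the table and column names are illustrative only, not the schema used for the Centro de Documentación Juan Bautista Vázquez.

```python
# Tiny illustration of a star-schema load for a library data warehouse:
# a loans fact table keyed to user and item dimensions.
# Table and column names are hypothetical.
import pandas as pd

dim_user = pd.DataFrame({"user_id": [1, 2], "faculty": ["Engineering", "Medicine"]})
dim_item = pd.DataFrame({"item_id": [10, 11], "collection": ["Print", "Digital"]})

fact_loans = pd.DataFrame({
    "user_id": [1, 1, 2],
    "item_id": [10, 11, 11],
    "loan_date": pd.to_datetime(["2024-03-01", "2024-03-02", "2024-03-02"]),
})

# Join facts with dimensions and aggregate: loans per faculty and collection.
report = (fact_loans
          .merge(dim_user, on="user_id")
          .merge(dim_item, on="item_id")
          .groupby(["faculty", "collection"])
          .size()
          .rename("loans"))
print(report)
```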
Abstract:
The volume of data in libraries has grown enormously in recent years, as has the complexity of its sources and formats, making its management and access difficult, especially as support for decision-making. Given that good library management involves the integration of strategic indicators, implementing a Data Warehouse (DW) that adequately manages such a quantity of information, as well as its complex mix of data sources, becomes an interesting alternative to consider. This article describes the design and implementation of a decision support system (DSS) based on DW techniques for the library of the Universidad de Cuenca. For this purpose, the study uses the holistic methodology proposed by Siguenza-Guzman et al. (2014) for the comprehensive evaluation of libraries. This methodology evaluates the collection and the services, incorporating important elements of library management such as service performance, quality control, collection use and user interaction. Based on this analysis, a DW architecture is proposed that integrates, processes and stores the data. Finally, the stored data are analyzed and visualized through online analytical processing (OLAP) tools. Initial implementation tests confirm the feasibility and effectiveness of the proposed approach, successfully integrating multiple heterogeneous data sources and formats, enabling library directors to generate customized reports and even allowing the transactional processes carried out daily to mature.
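OLAP-style analysis over such a warehouse amounts to slicing and aggregating the fact data along dimensions such as time and collection. The pandas pivot below is a minimal, hypothetical stand-in for that kind of cube query; it is not the Universidad de Cuenca implementation.

```python
# Minimal OLAP-style "cube" query with pandas: loans sliced by month and
# collection. Data and column names are hypothetical.
import pandas as pd

usage = pd.DataFrame({
    "month": ["2024-01", "2024-01", "2024-02", "2024-02", "2024-02"],
    "collection": ["Print", "Digital", "Print", "Digital", "Digital"],
    "loans": [120, 340, 95, 410, 60],
})

cube = usage.pivot_table(index="month", columns="collection",
                         values="loans", aggfunc="sum", margins=True)
print(cube)   # per-month, per-collection totals plus grand totals ("All")
```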
Abstract:
Libraries, since their inception 4000 years ago, have been in a process of constant change. Although change moved slowly for centuries, in recent decades academic libraries have continuously striven to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is conducted. Firstly, a holistic structure and the required toolset to holistically assess academic libraries are proposed to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used in which the library system and collection are analyzed from the perspective of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system, that is, analyzing library performance and the costs incurred and resources consumed by library services. The second quadrant evaluates the external perspective of the library system; users' perception of service quality is judged in this quadrant. The third quadrant analyzes the external perspective of the library collection, that is, evaluating the impact of the current library collection on its users. Finally, the fourth quadrant evaluates the internal perspective of the library collection; the usage patterns followed to manipulate the library collection are analyzed. With a complete framework for data collection in place, these data, coming from multiple sources and therefore in different formats, need to be integrated and stored in an adequate scheme for decision support. A data warehousing approach is secondly designed and implemented to integrate, process, and store the holistically collected data. Ultimately, the strategic data stored in the data warehouse are analyzed and applied for different purposes, including the following: 1) Data visualization and reporting are proposed to allow library managers to publish library indicators in a simple and quick manner by using online reporting tools. 2) Sophisticated data analysis is recommended through the use of data mining tools; three data mining techniques are examined in this research study: regression, clustering and classification. These data mining techniques have been applied to the case study in the following manner: predicting future investment in library development; finding clusters of users that share common interests and similar profiles but belong to different faculties; and predicting library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) As input for optimization models, early experiences of developing an optimal resource allocation model to distribute resources among the different processes of a library system are documented in this study.
Specifically, the problem of allocating funds for the digital collection among divisions of an academic library is addressed. An optimization model for the problem is defined with the objective of maximizing the usage of the digital collection over all library divisions subject to a single collection budget. By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution to assist library managers in making economic decisions based on an "as realistic as possible" perspective of the library situation.
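In its simplest continuous form, the allocation problem described can be written as a linear program: maximize expected usage as a function of the funds allocated to each division, subject to the total collection budget and per-division bounds. The scipy sketch below is a hypothetical illustration of that formulation; the usage-per-dollar coefficients, bounds and budget figure are invented for the example, not taken from the study.

```python
# Hypothetical linear-programming sketch of allocating a single digital-collection
# budget across library divisions to maximize expected usage.
# Usage-per-dollar coefficients, bounds and the budget are invented numbers.
from scipy.optimize import linprog

usage_per_dollar = [0.8, 1.2, 0.5]      # expected uses gained per dollar, per division
budget = 100_000.0                       # total collection budget
bounds = [(10_000, 60_000)] * 3          # minimum/maximum spend per division

# linprog minimizes, so negate the objective to maximize total expected usage.
res = linprog(c=[-u for u in usage_per_dollar],
              A_ub=[[1.0, 1.0, 1.0]], b_ub=[budget],
              bounds=bounds, method="highs")

print(res.x, -res.fun)   # optimal allocation per division and total expected usage
```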