949 results for Cadastral updating
Abstract:
A series of MCM-22/ZSM-35 composites has been hydrothermally synthesized and characterized by XRD, SEM, particle size distribution analysis, N2 adsorption and NH3-TPD techniques. Pulse and continuous-flow reactions were carried out to evaluate the catalytic performance of these composites in the aromatization of olefins. It was found that MCM-22/ZSM-35 composites could be rapidly crystallized at 174 degrees C with an optimal gel composition of SiO2/Al2O3=25, Na2O/SiO2=0.11, HMI/SiO2=0.35, and H2O/SiO2=45 (molar ratio), in which the weight ratio of ZSM-35 zeolite in the composite depended on the crystallization time. The coexistence of MCM-22 and ZSM-35 in the composite (MCM-22/ZSM-35=45/55 wt/wt) was observed to exert a notable synergistic effect on aromatization ability in butene conversion and FCC gasoline upgrading, possibly due to the intergrowth of some MCM-22 and ZSM-35 layers.
Abstract:
Durbin, J. & Urquhart, C. (2003). Qualitative evaluation of KA24 (Knowledge Access 24). Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: Knowledge Access 24 (NHS)
Abstract:
Tedd, L.A., Dahl, K., Francis, S., Tet?evov?, M. & ?ihlavn?kov?, E. (2002). Training for professional librarians in Slovakia by distance-learning methods: an overview of the PROLIB and EDULIB projects. Library Hi Tech, 20(3), 340-351. Sponsorship: European Union and the Open Society Institute
Abstract:
Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences
Abstract:
The proliferation of inexpensive workstations and networks has created a new era in distributed computing. At the same time, non-traditional applications such as computer-aided design (CAD), computer-aided software engineering (CASE), geographic-information systems (GIS), and office-information systems (OIS) have placed increased demands for high-performance transaction processing on database systems. The combination of these factors gives rise to significant challenges in the design of modern database systems. In this thesis, we propose novel techniques whose aim is to improve the performance and scalability of these new database systems. These techniques exploit client resources through client-based transaction management. Client-based transaction management is realized by providing logging facilities locally even when data is shared in a global environment. This thesis presents several recovery algorithms which utilize client disks for storing recovery related information (i.e., log records). Our algorithms work with both coarse and fine-granularity locking and they do not require the merging of client logs at any time. Moreover, our algorithms support fine-granularity locking with multiple clients permitted to concurrently update different portions of the same database page. The database state is recovered correctly when there is a complex crash as well as when the updates performed by different clients on a page are not present on the disk version of the page, even though some of the updating transactions have committed. This thesis also presents the implementation of the proposed algorithms in a memory-mapped storage manager as well as a detailed performance study of these algorithms using the OO1 database benchmark. The performance results show that client-based logging is superior to traditional server-based logging. 
This is because client-based logging is an effective way to reduce dependence on server CPU and disk resources, and thus delays the point at which the server becomes a performance bottleneck as the number of clients accessing the database increases.
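The logging scheme summarized above can be sketched in miniature. The following is a hypothetical illustration, not the thesis's actual algorithms: each client appends redo records for its own updates to a local log, forces the log at commit (the write-ahead rule), and after a crash replays only committed updates, including the case where different clients updated different portions of the same page.

```python
class ClientLog:
    """Toy client-side redo log; stands in for an append-only file on the client disk."""

    def __init__(self):
        self._records = []

    def log_update(self, txn_id, page_id, offset, new_bytes):
        # Redo-only record: enough to replay this client's update on a page,
        # even when other clients update other portions of the same page.
        self._records.append({
            "txn": txn_id, "page": page_id,
            "offset": offset, "data": new_bytes,
        })

    def commit(self, txn_id):
        # A commit record is forced to the log before acknowledging commit.
        self._records.append({"txn": txn_id, "commit": True})

    def redo(self, pages):
        # Replay only committed updates onto the page cache after a crash.
        committed = {r["txn"] for r in self._records if r.get("commit")}
        for r in self._records:
            if "data" in r and r["txn"] in committed:
                page = pages.setdefault(r["page"], bytearray(16))
                page[r["offset"]:r["offset"] + len(r["data"])] = r["data"]
        return pages
```

Because each client logs only its own updates, no merging of client logs is needed at recovery time; the uncommitted transaction's bytes are simply never replayed.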
Abstract:
This paper examines how and why web server performance changes as the workload at the server varies. We measure the performance of a PC acting as a standalone web server, running Apache on top of Linux. We use two important tools to understand what aspects of software architecture and implementation determine performance at the server. The first is a tool that we developed, called WebMonitor, which measures activity and resource consumption, both in the operating system and in the web server. The second is the kernel profiling facility distributed as part of Linux. We vary the workload at the server along two important dimensions: the number of clients concurrently accessing the server, and the size of the documents stored on the server. Our results quantify and show how more clients and larger files stress the web server and operating system in different and surprising ways. Our results also show the importance of fixed costs (i.e., opening and closing TCP connections, and updating the server log) in determining web server performance.
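The importance of fixed costs lends itself to a back-of-the-envelope model. The numbers below are illustrative assumptions, not measurements from the paper: per-request service time has a fixed part (opening and closing the TCP connection, updating the log) plus a part proportional to document size, so small files are dominated by the fixed costs.

```python
def request_time_ms(file_bytes, fixed_ms=2.0, bytes_per_ms=10_000):
    """Toy per-request service time: fixed overhead plus transfer time."""
    return fixed_ms + file_bytes / bytes_per_ms

# For a 1 KB file the fixed costs account for roughly 95% of service time;
# for a 1 MB file they are negligible.
small = request_time_ms(1_000)      # 2.1 ms
large = request_time_ms(1_000_000)  # 102.0 ms
```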
Abstract:
A method for reconstructing 3D rational B-spline surfaces from multiple views is proposed. The method takes advantage of the projective invariance properties of rational B-splines. Given feature correspondences in multiple views, the 3D surface is reconstructed via a four-step framework. First, corresponding features in each view are given an initial surface parameter value (s, t), and a 2D B-spline is fitted in each view. After this initialization, an iterative minimization procedure alternates between updating the 2D B-spline control points and re-estimating each feature's (s, t). Next, a non-linear minimization method is used to upgrade the 2D B-splines to 2D rational B-splines and obtain a better fit. Finally, a factorization method is used to reconstruct the 3D B-spline surface given the 2D B-splines in each view. This surface recovery method can be applied in both the perspective and orthographic cases. The orthographic case allows the use of additional constraints in the recovery. Experiments with real and synthetic imagery demonstrate the efficacy of the approach for the orthographic case.
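The alternation at the heart of the iterative step can be illustrated with a deliberately simplified stand-in: fitting a single linear segment (two control points) instead of a full rational B-spline. The structure mirrors the abstract's procedure, solving for control points by linear least squares given the current parameters and then re-estimating each feature's parameter by projection; everything else here is an assumption for brevity.

```python
def fit_segment(points, iters=10):
    """Alternate control-point least squares with parameter re-estimation."""
    # Initial parameter values: uniform spacing, as in the initialization step.
    n = len(points)
    ts = [i / (n - 1) for i in range(n)]
    for _ in range(iters):
        # (a) Update control points: linear least squares over basis [(1-t), t].
        a00 = a01 = a11 = bx0 = bx1 = by0 = by1 = 0.0
        for (x, y), t in zip(points, ts):
            b0, b1 = 1.0 - t, t
            a00 += b0 * b0; a01 += b0 * b1; a11 += b1 * b1
            bx0 += b0 * x; bx1 += b1 * x
            by0 += b0 * y; by1 += b1 * y
        det = a00 * a11 - a01 * a01
        p0 = ((a11 * bx0 - a01 * bx1) / det, (a11 * by0 - a01 * by1) / det)
        p1 = ((a00 * bx1 - a01 * bx0) / det, (a00 * by1 - a01 * by0) / det)
        # (b) Re-estimate each feature's parameter by projecting onto the segment.
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        seg2 = dx * dx + dy * dy
        ts = [min(1.0, max(0.0, ((x - p0[0]) * dx + (y - p0[1]) * dy) / seg2))
              for x, y in points]
    return p0, p1, ts
```

The same alternation generalizes to a B-spline basis with more control points; only the least-squares system grows.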
Abstract:
Most associative memory models perform one-level mapping between predefined sets of input and output patterns and are unable to represent hierarchical knowledge. Complex AI systems allow hierarchical representation of concepts, but generally do not have learning capabilities. In this paper, a memory model is proposed which forms a concept hierarchy by learning sample relations between concepts. All concepts are represented in a concept layer. Relations between a concept and its defining lower-level concepts are chunked as cognitive codes represented in a coding layer. By updating memory contents in the concept layer through code firing in the coding layer, the system is able to perform an important class of commonsense reasoning, namely recognition and inheritance.
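The recognition and inheritance behaviours can be sketched with a toy symbolic version of the chunking idea. The concepts and codes below are invented examples, and the actual model is a layered memory rather than Python dictionaries, so this is only a structural analogy: a code fires when all of its defining concepts are active (recognition), and properties are collected by walking down through codes (inheritance).

```python
# Each code chunks a concept's defining lower-level concepts.
codes = {
    "bird": {"has_wings", "lays_eggs"},
    "canary": {"bird", "is_yellow"},
}

def recognize(features):
    """Fire codes repeatedly: a concept activates once its parts are active."""
    active = set(features)
    changed = True
    while changed:
        changed = False
        for concept, parts in codes.items():
            if parts <= active and concept not in active:
                active.add(concept)
                changed = True
    return active

def inherits(concept, prop):
    """Inheritance: collect defining concepts transitively through codes."""
    seen, stack = set(), [concept]
    while stack:
        c = stack.pop()
        if c in seen:
            continue
        seen.add(c)
        stack.extend(codes.get(c, ()))
    return prop in seen
```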
Abstract:
The child is the most precious asset and the focal point of development for any country. However, unless children are brought up in a stimulating and conducive environment, receiving the best possible care and protection, their physical, mental, emotional and social development is susceptible to permanent damage. Because Ethiopia is one of the least developed countries of the world, owing to interrelated and complex socio-economic factors including man-made and natural calamities, a large portion of its population - especially children - is victimized by social evils such as famine, disease, poverty, mass displacement, lack of education and family instability. Given that children are the most vulnerable group in society and that they constitute half of the population, it is evident that a considerable number of Ethiopian children are living under difficult circumstances. Therefore, as in a number of other third-world countries, there are many poor, displaced, unaccompanied and orphaned children in the country. A considerable proportion of these children work on the street, with some living entirely on the street without any adult care and protection. These children are forced onto the streets in their fight for survival. They supplement their parents' meagre income or support themselves with the small incomes they earn doing menial jobs. In doing so, street children face the danger of accidents and violence; they are exploited and abused; many are forced to drop out of school or never get the chance to be enrolled at all; and some drift into begging or petty crime. This study was undertaken mainly to update the findings of previous studies, monitor changing trends, examine new facets of the problem and gain a better understanding of the phenomenon in the country by covering at least some of the major centres where the problem is acute.
Thus, the outcome of this research can be useful in the formulation of the country's social welfare programme. Finally, in recognition of the urgency of the problem and the limited resources available, the Ministry of Labour and Social Affairs expresses its appreciation to all agencies engaged in the rehabilitation of street children and the prevention of the problem. The Ministry also calls for closer co-operation and support among concerned governmental and non-governmental organizations in their efforts to improve the situation of street children and to curb the overwhelming scale of the problem.
Abstract:
BACKGROUND: Over the past two decades, genomics has evolved as a scientific research discipline. Genomics research was fueled initially by government and nonprofit funding sources, later augmented by private research and development (R&D) funding. Citizens and taxpayers of many countries have funded much of the research, and have expectations about access to the resulting information and knowledge. While access to knowledge gained from all publicly funded research is desired, access is especially important for fields that have broad social impact and stimulate public dialogue. Genomics is one such field, where public concerns are raised for reasons such as health care and insurance implications, as well as personal and ancestral identification. Thus, genomics has grown rapidly as a field, and attracts considerable interest. RESULTS: One way to study the growth of a field of research is to examine its funding. This study focuses on public funding of genomics research, identifying and collecting data from major government and nonprofit organizations around the world, and updating previous estimates of world genomics research funding, including information about geographical origins. We initially identified 89 publicly funded organizations; we requested information about each organization's funding of genomics research. Of these organizations, 48 responded and 34 reported genomics research expenditures (of those that responded but did not supply information, some did not fund such research, others could not quantify it). The figures reported here include all the largest funders and we estimate that we have accounted for most of the genomics research funding from government and nonprofit sources. CONCLUSION: Aggregate spending on genomics research from 34 funding sources averaged around $2.9 billion in 2003-2006. 
The United States spent more than any other country on genomics research, corresponding to 35% of overall worldwide public funding (compared with a 49% US share of public health research funding for all purposes). When ranked by genomics funding intensity, however, the United States dropped below Ireland, the United Kingdom, and Canada, as measured both by genomics research expenditure per capita and per unit of Gross Domestic Product.
Abstract:
There are currently over two million Palestinian refugees residing in Jordan, 370,000 of whom live in refugee camps. Due to conflict-affiliated disease outbreaks among children in the region, the UN Relief and Works Agency for Palestine Refugees (UNRWA) has identified incomplete vaccination as a critical public health issue and has invested in the development and implementation of a text message reminder service for preventing loss-to-follow-up. Childhood immunization rates in UNRWA catchment regions are generally high, yet little is known about risk factors for missed appointments, which impose a substantial administrative burden due to the need to contact patients for rescheduling. Stronger user characterization is necessary for improved targeting and minimized cost as we develop a more robust SMS system capable of scaling across all health facilities.
This mixed-methods study prospectively recorded 6 months of immunization history among a cohort of children born in June 2014 at Taybeh Health Center in Amman. Demographic information was collected at the time of birth, and caregivers of cohort members were invited to participate in interviews that assessed immunization knowledge, preferences, decision-making, and experience with the SMS reminder system. Patients were more likely to significantly delay appointments during the Ramadan holiday and for doses scheduled further from the child's date of birth. Future policies that might bridge these gaps include targeting pre-appointment SMS reminders to high-risk patients, implementing holiday shifts in clinic hours, and regularly updating patient contact information.
Abstract:
In this paper, a knowledge-based approach is proposed for the management of temporal information in process control. A common-sense theory of temporal constraints over processes/events, allowing relative temporal knowledge, is employed here as the temporal basis for the system. This theory supports duration reasoning and consistency checking, and accepts relative temporal knowledge which is in a form normally used by human operators. An architecture for process control is proposed which centres on an historical database consisting of events and processes, together with the qualitative temporal relationships between their occurrences. The dynamics of the system is expressed by means of three types of rule: database updating rules, process control rules, and data deletion rules. An example is provided in the form of a life scheduler, to illustrate the database and the rule sets. The example demonstrates the transitions of the database over time, and identifies the procedure in terms of a state transition model for the application. The dividing instant problem for logical inference is discussed with reference to this process control example, and it is shown how the temporal theory employed can be used to deal with the problem.
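A minimal sketch of the three rule types, with invented process names, might look as follows: a database-updating rule derives a new event once its precursor events are recorded, a process-control rule maps the derived event to an action, and a data-deletion rule prunes occurrences older than a horizon. The actual system also carries qualitative temporal relationships between occurrences, which this toy omits.

```python
class HistoricalDB:
    """Toy historical database of (time, event) occurrences driven by three rule types."""

    def __init__(self):
        self.events = []

    def record(self, time, name):
        self.events.append((time, name))
        # Database-updating rule: derive a composite event once both
        # precursor events have occurred (hypothetical process names).
        names = {n for _, n in self.events}
        if {"valve_open", "pump_on"} <= names and "flow_started" not in names:
            self.events.append((time, "flow_started"))

    def control_actions(self):
        # Process-control rule: react to the derived event with an action.
        if any(n == "flow_started" for _, n in self.events):
            return ["open_outlet"]
        return []

    def prune(self, horizon):
        # Data-deletion rule: drop occurrences older than the horizon.
        self.events = [(t, n) for t, n in self.events if t >= horizon]
```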
Abstract:
This article describes ongoing research on developing a portal framework based on the OASIS Web Services for Remote Portlets (WSRP) standard for integration of Web-based education contents and services made available through a model for a European Networked University. We first identify the requirements for such a framework that supports integration at the presentation level and collaboration in developing and updating study programmes and course materials. We then outline the architecture design, and report on the initial implementation and preliminary evaluation.
Abstract:
This is a briefing report on when the safety issues identified in a July 2008 report by Jülich should have become apparent. In July 2008, the German Jülich nuclear research centre published a report entitled ‘A safety re-evaluation of the AVR pebble bed reactor operation and its consequences for future HTR concepts.’ It concluded: ‘pebble bed HTRs require additional safety related R&D effort and updating of safety analyses before construction.’
Abstract:
Executive Summary
1. The Marine Life Information Network (MarLIN) has been developed since 1998. Defra funding has supported a core part of its work, the Biology and Sensitivity Key Information Sub-programme. This report relates to Biology and Sensitivity work for the period 2001-2004.
2. MarLIN Biology and Sensitivity research takes information on the biology of species to identify the likely effects of changing environmental conditions linked to human activities on those species. In turn, species that are key functional, key structural, dominant, or characteristic in a biotope (the habitat and its associated species) are used to identify biotope sensitivity. Results are displayed over the World Wide Web and can be accessed via a range of search tools that make the information relevant to environmental management.
3. The first Defra contract enabled the development of criteria and methods of research, database storage methods and the research of a wide range of species. A contract from English Nature and Scottish Natural Heritage enabled biotopes relevant to marine SACs to be researched.
4. Defra funding in 2001-2004 has especially enabled recent developments to be targeted for research. Those developments included the identification of threatened and declining species by the OSPAR Biodiversity Committee, the development of a new approach to defining sensitivity (part of the Review of Marine Nature Conservation), and the opportunity to use Geographical Information Systems (GIS) more effectively to link survey data to MarLIN assessments of sensitivity.
5. The MarLIN database has been developed to provide a resource to 'pick-and-mix' information depending on the questions being asked. Using GIS, survey data that provides locations for species and biotopes has been linked to information researched by MarLIN to map the likely sensitivity of an area to a specified factor. Projects undertaken for the Irish Sea pilot (marine landscapes), in collaboration with CEFAS (fishing impacts) and with the Countryside Council for Wales (oil spill response) have demonstrated the application of MarLIN information linked to survey data in answering, through maps, questions about the likely impacts of human activities on seabed ecosystems.
6. GIS applications that use MarLIN sensitivity information give meaningful results when linked to localized and detailed survey information (lists of species and biotopes as point sources or mapped extents). However, broad landscape units require further interpretation.
7. A new mapping tool (SEABED map) has been developed to display data on species distributions and survey data according to search terms that might be used by an environmental manager.
8. MarLIN outputs are best viewed on the Web site, where the most up-to-date information from live databases is available. The MarLIN Web site receives about 1600 visits a day.
9. The MarLIN approach to assessing sensitivity and its application to environmental management were presented in papers at three international conferences during the current contract, and a 'touchstone' paper is to be published in the peer-reviewed journal Hydrobiologia. The utility of MarLIN information for environmental managers, amongst other sorts of information, has been described in an article in Marine Pollution Bulletin.
10. MarLIN information is being used to inform the identification of potential indicator species for implementation of the Water Framework Directive, including initiatives by ICES.
11. Non-Defra funding streams are supporting the updating of reviews and increasing the amount of peer review undertaken, both of which are important to the maintenance of the resource. However, whilst MarLIN information is sufficiently wide-ranging to be used in an 'operational' way for marine environmental protection and management, new initiatives and the new biotopes classification have introduced additional species and biotopes that will need to be researched in the future.
12. By the end of the contract, the Biology and Sensitivity Key Information database contained full Key Information reviews on 152 priority species and 117 priority biotopes, together with basic information on 412 species; a total of 564 marine benthic species.