976 results for Information Requirements: Data Availability
Abstract:
The Balkan Vegetation Database (BVD; GIVD ID: EU-00-019; http://www.givd.info/ID/EU-00-019) is a regional database that consists of phytosociological relevés from different vegetation types from six countries on the Balkan Peninsula (Albania, Bosnia and Herzegovina, Bulgaria, Kosovo, Montenegro and Serbia). Currently, it contains 9,580 relevés, most of them (78%) geo-referenced. The database includes digitized relevés from the literature (79%) and unpublished data (21%). Herein we present descriptive statistics about attributive relevé information. We developed rules that regulate the governance of the database, data provision, types of data availability regimes, data requests and terms of use, authorship, and relationships with other databases. The database offers an extensive overview of studies at the local, regional and SE European levels, including information about flora, vegetation and habitats.
Abstract:
Each issue consists of six or more volumes, each covering a range of individual states.
Abstract:
Sustainable management of coastal and coral reef environments requires regular collection of accurate information on recognized ecosystem health indicators. Satellite image data and derived maps of water column and substrate biophysical properties provide an opportunity to develop baseline mapping and monitoring programs for coastal and coral reef ecosystem health indicators. A significant challenge for satellite image data in coastal and coral reef water bodies is the mixture of both clear and turbid waters. A new approach is presented in this paper to enable production of water quality and substrate cover type maps, linked to a field-based coastal ecosystem health indicator monitoring program, for use in turbid to clear coastal and coral reef waters. An optimized optical domain method was applied to map selected water quality (Secchi depth, Kd(PAR), tripton, CDOM) and substrate cover type (seagrass, algae, sand) parameters. The approach is demonstrated using commercially available Landsat 7 Enhanced Thematic Mapper image data over a coastal embayment exhibiting the range of substrate cover types and water quality conditions commonly found in sub-tropical and tropical coastal environments. Spatially extensive and quantitative maps of selected water quality and substrate cover parameters were produced for the study site. These map products were refined through interactions with management agencies to suit the information requirements of their monitoring and management programs.
Abstract:
Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. The lack of a standardized measurement framework to permit comparisons across diseases, injuries and risk factors, together with the failure to systematically evaluate data quality, has impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence, the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) which simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease and injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and region, of over 100 diseases and injuries. National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
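As context for the DALY metric mentioned above, a minimal sketch of the basic (undiscounted, unweighted) arithmetic follows: DALYs combine years of life lost to premature mortality (YLL) with years lived with disability (YLD). The numbers below are invented illustrations, not Study estimates.

```python
def yll(deaths: float, life_expectancy_at_death: float) -> float:
    """Years of Life Lost: deaths x standard life expectancy at age of death."""
    return deaths * life_expectancy_at_death

def yld(incident_cases: float, disability_weight: float, duration_years: float) -> float:
    """Years Lived with Disability (incidence-based): cases x severity weight x duration."""
    return incident_cases * disability_weight * duration_years

def daly(deaths, life_exp, cases, dw, duration):
    """DALYs express fatal (YLL) and non-fatal (YLD) burden in a single metric."""
    return yll(deaths, life_exp) + yld(cases, dw, duration)

# Illustrative only: 1,000 deaths with 30 years of remaining life expectancy,
# plus 10,000 new cases with disability weight 0.2 lasting 2 years.
print(daly(1_000, 30, 10_000, 0.2, 2))  # 30000 + 4000 = 34000 DALYs
```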
Abstract:
Background and purpose: Survey data quality is a combination of the representativeness of the sample, the accuracy and precision of measurements, and data processing and management, with several subcomponents in each. The purpose of this paper is to show how, in the final risk factor surveys of the WHO MONICA Project, information on data quality was obtained, quantified, and used in the analysis. Methods and results: In the WHO MONICA (Multinational MONItoring of trends and determinants in CArdiovascular disease) Project, information about the data quality components was documented in retrospective quality assessment reports. On the basis of the documented information and the survey data, the quality of each data component was assessed and summarized using quality scores. The quality scores were used in sensitivity testing of the results, both by excluding populations with low quality scores and by weighting the data by their quality scores. Conclusions: Detailed documentation of all survey procedures with standardized protocols, training, and quality control are steps towards optimizing data quality. Quantifying data quality is a further step. The methods used in the WHO MONICA Project could be adopted to improve quality in other health surveys.
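A minimal sketch of the two sensitivity-testing strategies described above (exclusion below a quality threshold, and quality-weighted estimates). The population names, estimates and 0-2 quality scores are invented; this is not the MONICA Project's actual code.

```python
# Hypothetical per-population risk-factor estimates paired with 0-2 quality scores.
populations = {"pop_A": (5.2, 2), "pop_B": (6.1, 1), "pop_C": (4.8, 0)}

# Sensitivity test 1: exclude populations whose quality score falls below a threshold.
kept = {name: est for name, (est, score) in populations.items() if score >= 1}

# Sensitivity test 2: weight each population's estimate by its quality score,
# so low-quality data contributes less to the pooled result.
weighted_mean = (sum(est * score for est, score in populations.values())
                 / sum(score for _, score in populations.values()))
print(kept, weighted_mean)
```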
Abstract:
This thesis presents the formal definition of a novel Mobile Cloud Computing (MCC) extension of the Networked Autonomic Machine (NAM) framework, a general-purpose conceptual tool for describing large-scale distributed autonomic systems. The introduction of autonomic policies into the MCC paradigm has proved to be an effective technique for increasing the robustness and flexibility of MCC systems. In particular, autonomic policies based on continuous resource and connectivity monitoring help automate context-aware decisions for computation offloading. We have also provided NAM with a formalization in terms of a transformational operational semantics in order to fill the gap between its existing Java implementation, NAM4J, and its conceptual definition. Moreover, we have extended NAM4J by adding several components for managing large-scale autonomic distributed environments. In particular, the middleware allows for the implementation of peer-to-peer (P2P) networks of NAM nodes, and NAM mobility actions have been implemented to enable the migration of code, execution state and data. Within NAM4J, we have designed and developed a component, denoted as the context bus, which is particularly useful in collaborative applications in that, if replicated on each peer, it instantiates a virtual shared channel allowing nodes to notify and be notified about context events. Regarding the management of autonomic policies, we have provided NAM4J with a rule engine, whose purpose is to allow a system to autonomously determine when offloading is convenient. We have also provided NAM4J with trust and reputation management mechanisms to make the middleware suitable for applications in which such aspects are of great interest. To this purpose, we have designed and implemented a distributed framework, denoted as DARTSense, in which no central server is required, as reputation values are stored and updated by participants in a subjective fashion. We have also investigated the literature on MCC systems. The analysis pointed out that all MCC models focus on mobile devices and treat the Cloud as a system with unlimited resources. To help fill this gap, we defined a modeling and simulation framework for the design and analysis of MCC systems that encompasses both the mobile and the cloud sides, and we implemented a modular and reusable simulator of the model. We have applied the NAM principles to two different application scenarios. First, we defined a hybrid P2P/cloud approach in which components and protocols are autonomically configured according to specific target goals, such as cost-effectiveness, reliability and availability. Merging the P2P and cloud paradigms brings together the advantages of both: high availability, provided by the Cloud presence, and low cost, obtained by exploiting inexpensive peer resources. As an example, we have shown how the proposed approach can be used to design NAM-based collaborative storage systems built on an autonomic policy that decides how to distribute data chunks among peers and the Cloud according to cost-minimization and data-availability goals. As a second application, we defined an autonomic architecture for decentralized urban participatory sensing (UPS) which bridges sensor networks and mobile systems to improve effectiveness and efficiency. The developed application allows users to retrieve and publish different types of sensed information by using the features provided by NAM4J's context bus. Trust and reputation are managed through the application of DARTSense mechanisms, and the application includes an autonomic policy that detects areas with few contributors and tries to recruit new providers by migrating the code necessary for sensing through NAM mobility actions.
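To make the offloading-policy idea concrete, here is a minimal sketch of the kind of context-aware rule such an engine might evaluate; the function name, thresholds and inputs are invented for illustration and do not reproduce NAM4J's actual rule-engine API.

```python
def should_offload(local_exec_s: float, remote_exec_s: float,
                   payload_mb: float, bandwidth_mbps: float,
                   battery_pct: float) -> bool:
    """Offload when the remote path (transfer + remote execution) beats local
    execution, or when the battery is critically low and the link is usable."""
    if bandwidth_mbps <= 0:
        return False                    # no connectivity: must run locally
    transfer_s = payload_mb * 8 / bandwidth_mbps
    if battery_pct < 15 and transfer_s < local_exec_s:
        return True                     # prioritize saving energy on a usable link
    return transfer_s + remote_exec_s < local_exec_s

# Example: 40 MB of state over a 20 Mbit/s link, remote execution 4x faster.
print(should_offload(local_exec_s=120, remote_exec_s=30,
                     payload_mb=40, bandwidth_mbps=20, battery_pct=60))  # True
```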
Abstract:
The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types and looks at the use of 'state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. 'component makers') and other manufacturing companies (i.e. 'component buyers'). An investigation of the manual procedures used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff, and suggests the unsuitability of many manufacturing techniques which have been applied to general engineering companies. From the literature it was discovered that computer-assisted systems are required which are primarily 'information-based' rather than 'decision-based', whilst the availability of low-cost computers and packaged software has enabled foundries to 'get their feet wet' without the financial penalties which characterized many of the early attempts at computer assistance (i.e. pre-1980). Moreover, no evidence of a single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-systems of CAPM which they support. The work was oriented specifically at the functions of production planning and scheduling and introduces the concept of 'manual interaction' for effective scheduling. The techniques developed were designed to use the information readily available in foundries and proved practically successful following their implementation in a wide variety of foundries. The limitations of the techniques developed are subsequently discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.
Abstract:
The value of knowing about data availability and system accessibility is analyzed through theoretical models of Information Economics. When a user places an inquiry for information, it is important for the user to learn whether the system is inaccessible or the data is unavailable, rather than to receive no response at all. In reality, the system can produce various outcomes: nothing is displayed to the user (e.g., a traffic light that does not operate, a browser that keeps loading, a telephone that is not answered); random noise is displayed (e.g., a traffic light showing random signals, a browser returning disorderly results, an automatic voice message that does not clarify the situation); or a special signal indicates that the system is not operating (e.g., a blinking amber light indicating that the traffic light is down, a browser responding that the site is unavailable, a voice message regretting that the service is not available). This article develops a model to assess the value of the information for the user in such situations by employing the information structure model prevailing in Information Economics. Examples related to data accessibility in centralized and distributed systems are provided for illustration.
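A minimal worked sketch of the kind of information-structure comparison the article describes: the value of a reliable "service unavailable" signal is the difference between the user's expected payoff with and without it. The payoffs V and C and the probability p below are invented illustrations, not the article's model parameters.

```python
# Acting (querying and waiting) pays V if the system is up, costs C if it is
# down; abandoning pays 0. p = P(system up).
def expected_value_no_signal(p: float, V: float, C: float) -> float:
    """Without any indication, the user acts only if the unconditional bet pays."""
    return max(p * V - (1 - p) * C, 0.0)

def expected_value_perfect_signal(p: float, V: float, C: float) -> float:
    """A reliable 'unavailable' signal lets the user act only when the system is up."""
    return p * V  # down states are avoided at zero cost

def value_of_availability_signal(p: float, V: float, C: float) -> float:
    return expected_value_perfect_signal(p, V, C) - expected_value_no_signal(p, V, C)

# With p=0.7, V=10, C=20: no signal gives max(7-6, 0)=1; the signal gives 7.
print(value_of_availability_signal(0.7, 10, 20))  # 6.0
```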
Abstract:
Стефанка Чукова, Хър Гуан Тео - In this study we consider and extend our previous work on the censoring typical of automotive warranty data. To address the problem of incomplete mileage information, we use a linear approach within a nonparametric framework. We estimate the mean cumulative warranty cost (per vehicle) and its standard error as a function of age, mileage and real (calendar) time.
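As an illustration of the nonparametric framework mentioned above, here is a sketch of Nelson's mean cumulative function (MCF) estimator applied to per-vehicle warranty cost; the vehicle IDs, ages, costs and the simple censoring treatment are toy inputs, not the authors' estimator.

```python
# At each claim age, the MCF increases by (claim cost) / (vehicles still observed).
censor_age = {"v1": 24, "v2": 36, "v3": 12}   # months each vehicle is under observation
claims = [("v1", 5, 120.0), ("v2", 5, 80.0), ("v2", 20, 200.0), ("v3", 10, 60.0)]

def mcf_cost(censor_age: dict, claims: list) -> list:
    """Mean cumulative warranty cost per vehicle as a step function of age."""
    points, total = [], 0.0
    for _vid, age, cost in sorted(claims, key=lambda c: c[1]):
        at_risk = sum(1 for a in censor_age.values() if a >= age)
        total += cost / at_risk
        points.append((age, round(total, 2)))
    return points

print(mcf_cost(censor_age, claims))  # [(5, 40.0), (5, 66.67), (10, 86.67), (20, 186.67)]
```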
Abstract:
Motivation: In any macromolecular polyprotic system - for example protein, DNA or RNA - the isoelectric point, commonly referred to as the pI, can be defined as the point of singularity in a titration curve, corresponding to the solution pH value at which the net overall surface charge - and thus the electrophoretic mobility - of the ampholyte sums to zero. Different modern analytical biochemistry and proteomics methods depend on the isoelectric point as a principal feature for protein and peptide characterization. Protein separation by isoelectric point is a critical part of 2-D gel electrophoresis, a key precursor of proteomics, where discrete spots can be digested in-gel and proteins subsequently identified by analytical mass spectrometry. Peptide fractionation according to pI is also widely used in current proteomics sample preparation procedures prior to LC-MS/MS analysis. Accurate theoretical prediction of pI would therefore expedite such analyses. While pI calculation is widely used, it remains largely untested, motivating our efforts to benchmark pI prediction methods. Results: Using data from the database PIP-DB and one publicly available dataset as our reference gold standard, we have undertaken the benchmarking of pI calculation methods. We find that methods vary in their accuracy and are highly sensitive to the choice of basis set. The machine-learning algorithms, especially the SVM-based algorithm, showed superior performance when studying peptide mixtures. In general, learning-based pI prediction methods (such as Cofactor, SVM and Branca) require a large training dataset and their resulting performance strongly depends on the quality of that data. In contrast to iterative methods, machine-learning algorithms have the advantage of being able to add new features to improve the accuracy of prediction. Contact: yperez@ebi.ac.uk Availability and Implementation: The software and data are freely available at https://github.com/ypriverol/pIR. Supplementary information: Supplementary data are available at Bioinformatics online.
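For concreteness, a sketch of the classical iterative pI calculation that such benchmarks evaluate: compute the Henderson-Hasselbalch net charge at a given pH and bisect to the zero crossing. The pKa values shown are one example basis set; as the abstract notes, published scales differ and results are sensitive to that choice.

```python
# One example pKa basis set (values differ between published scales).
PKA_POS = {"K": 10.5, "R": 12.5, "H": 6.0}           # basic side chains
PKA_NEG = {"D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}  # acidic side chains
PKA_NTERM, PKA_CTERM = 9.0, 2.0                       # termini

def net_charge(seq: str, ph: float) -> float:
    """Henderson-Hasselbalch net charge of a peptide at a given pH."""
    pos = 1.0 / (1.0 + 10 ** (ph - PKA_NTERM))
    pos += sum(1.0 / (1.0 + 10 ** (ph - PKA_POS[aa])) for aa in seq if aa in PKA_POS)
    neg = 1.0 / (1.0 + 10 ** (PKA_CTERM - ph))
    neg += sum(1.0 / (1.0 + 10 ** (PKA_NEG[aa] - ph)) for aa in seq if aa in PKA_NEG)
    return pos - neg

def isoelectric_point(seq: str, lo: float = 0.0, hi: float = 14.0,
                      tol: float = 1e-4) -> float:
    """Bisect on pH: net charge decreases monotonically, so its zero is the pI."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if net_charge(seq, mid) > 0:
            lo = mid
        else:
            hi = mid
    return round((lo + hi) / 2, 3)

print(isoelectric_point("ACDKE"))
```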
Abstract:
Little is known about whether trust plays a similar role in the use of information obtained through intra- and inter-organizational relationships. This study addresses this question by investigating how trust leads to information use. Data from 338 intra-organizational and a sub-sample of 158 inter-organizational dyadic information-exchange relationships showed that trust is an important driver of the utilization of market information in both cases. Trust has no direct relationship to information use; instead, it has a strong indirect effect through a mediator, the perceived quality of information. The effects of trust on the use of information obtained through intra- and inter-organizational dyadic relationships proved to be similar.
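A minimal sketch of the product-of-coefficients mediation structure the study reports (trust raises perceived information quality, which in turn raises information use, with no direct path); the data and coefficients are simulated for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 338
trust = rng.normal(size=n)
quality = 0.6 * trust + rng.normal(scale=0.8, size=n)              # path a
use = 0.5 * quality + 0.0 * trust + rng.normal(scale=0.8, size=n)  # path b, no direct path

# Path a: regress the mediator (perceived quality) on trust.
a = np.cov(trust, quality)[0, 1] / np.var(trust, ddof=1)

# Paths b and c' (direct effect): regress use on quality and trust together.
X = np.column_stack([np.ones(n), quality, trust])
_, b, c_prime = np.linalg.lstsq(X, use, rcond=None)[0]

indirect = a * b  # product-of-coefficients estimate of the mediated effect
print(a, b, c_prime, indirect)  # c_prime should be near zero here
```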
Abstract:
Since the mid-1970s the world has undergone significant transformations, entering a new era of global capital accumulation and beginning the so-called productive restructuring. This restructuring is materialized spatially through the reconfiguration of territory, redefining its uses and producing a new spatial structure. In the territory of Rio Grande do Norte, new economic activities emerged from the 1980s onwards, among which tourism stands out, stimulated by government policies. In this context the east coast of the state has been prominent, because its beaches have recognized scenic appeal and environmental quality. Seeing a business opportunity, international investors invested in this portion of Rio Grande do Norte's territory, especially over the last decade. The expansion of this process to the north coast resulted in intense property dynamics in the municipalities of Maxaranguape and Rio do Fogo. The low value of land and buildings compared to the European market, together with the availability of real property, were the main factors attracting such investors, who saw new business opportunities with high rates of profitability in tropical areas hitherto remote from the economic system. The objective of the research is therefore to analyze to what extent international investment promoted the appreciation of property in the municipalities of Maxaranguape and Rio do Fogo. The time frame covers the period 2000-2013. The methodology consisted of the following procedures: surveys and analysis of data collected in the property registry offices of the Maxaranguape and Rio do Fogo municipalities; interviews with public and private officials relevant to the analysis of the spatial transformations and property appreciation that occurred in the municipalities studied; and collection of secondary data from official bodies such as IBGE, MTUR, SETUR and BNB. Analyzing the information and data cataloged, it was concluded that these investments reinforce old leisure and tourism practices that existed in those territories and have displaced (creating new territorial arrangements) a significant part of the east coast areas of the state. Another consequence of this recent phenomenon is the increase in property values that has occurred in this part of the state, which is directly connected to that investment. It is therefore apparent that the expansion and incorporation of territories by capital reveal, in part, the strategies of the capitalist mode of production, evident in the search for better conditions of accumulation and in the expansion of alternative uses of properties in a selective and uneven way across geographic space. The mechanisms that capital uses to impose its practices can operate through rising property market values, meaning that the reproduction of imbalances often happens through marked property speculation and the rapid appreciation of properties.
Abstract:
Ireland’s climate is changing. This is consistent with regional and global trends, which display rapid changes in many aspects of climate over the last century and the first decade of this century. The availability of high-quality climate observations is a critical starting point from which an understanding of past and emerging trends in the current climate can be developed. Such observations are vital for detecting change and providing the information needed to help manage and plan for the future in a wide range of socio-economic sectors. Observations are also essential to help build robust projections of future climate, which can in turn inform policy formulation for appropriate mitigation and adaptation measures. Such measures should help us limit the negative socio-economic impacts and position us to take advantage of opportunities offered by a changing climate. This report brings together observational information and data for over 40 climate variables and highlights changes and trends in aspects of Irish climate across the atmospheric, oceanic and terrestrial domains. The observations presented in this report contribute to the formulation of the Essential Climate Variables (ECVs) as defined by the Global Climate Observing System (GCOS).
Abstract:
Prior finance literature lacks a comprehensive analysis of the microstructure characteristics of U.S. futures markets due to a lack of data availability. Utilizing a unique data set for five different futures contracts, this dissertation fills this gap in the finance literature. In three essays, price discovery, resiliency and the components of bid-ask spreads in electronic futures markets are examined. In order to provide comprehensive and robust analysis, both the moderately volatile pre-crisis period and the volatile crisis period are included. The first essay, entitled "Price Discovery and Liquidity Characteristics for U.S. Electronic Futures and ETF Markets," explores the price discovery process in U.S. futures and ETF markets. Hasbrouck's information share method is applied to futures and ETF instruments. The information share results show that futures markets dominate the price discovery process. The results on the factors that affect the price discovery process show that when volatility increases, the price leadership of futures markets declines; furthermore, when the relative size of the bid-ask spread in one market increases, its information share decreases. The second essay, entitled "The Resiliency of Large Trades for U.S. Electronic Futures Markets," examines the effects of large trades in futures markets. How quickly prices and liquidity recover after large trades is an important characteristic of financial markets. The price effects of large trades are greater during the crisis period than during the pre-crisis period; furthermore, relative to the pre-crisis period, more trades are required during the crisis period before liquidity returns to pre-block-trade levels. The third essay, entitled "Components of Quoted Bid-Ask Spreads in U.S. Electronic Futures Markets," investigates the bid-ask spread components in futures markets. The components of bid-ask spreads are one of the most important subjects of microstructure studies. Utilizing Huang and Stoll's (1997) method, the third essay provides the first analysis of the components of quoted bid-ask spreads in U.S. electronic futures markets. The results show that order processing cost is the largest component of bid-ask spreads, followed by inventory holding costs. During the crisis period, market makers increase bid-ask spreads due to increasing inventory holding and adverse selection risks.
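As a sketch of the spread-decomposition idea in the third essay, the basic Huang and Stoll (1997) two-way regression is shown below on simulated trades; it separates the order-processing component from the combined adverse-selection/inventory component (the three-way split used in the essay requires the extended model), and all numbers here are toy values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
q = rng.choice([-1.0, 1.0], size=n)        # +1 buyer-initiated, -1 seller-initiated
S_true, lam_true = 0.10, 0.4               # spread; non-order-processing share

# Midquote: random walk plus the permanent/inventory impact of the last trade.
eps = rng.normal(0.0, 0.01, size=n)
mid = np.cumsum(lam_true * (S_true / 2) * np.concatenate(([0.0], q[:-1])) + eps)
p = mid + (S_true / 2) * q                 # trades execute at the quoted half-spread

# Regress price changes on dq_t and q_{t-1}:
#   dP_t = (S/2) dq_t + lam (S/2) q_{t-1} + e_t
dp = np.diff(p)
X = np.column_stack([q[1:] - q[:-1], q[:-1]])
coef, *_ = np.linalg.lstsq(X, dp, rcond=None)
S_hat = 2 * coef[0]                        # estimated spread
lam_hat = coef[1] / coef[0]                # adverse-selection + inventory share
print(S_hat, lam_hat, 1 - lam_hat)         # last term: order-processing share
```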