885 results for Context data
Abstract:
OBJECTIVE There is controversy regarding the significance of radiological consolidation in the context of COPD exacerbation (eCOPD). While some studies of eCOPD exclude these cases, consolidation is a common feature of eCOPD admissions in real practice. This study addresses the question of whether consolidation in eCOPD is a distinct clinical phenotype with implications for management decisions and outcomes. PATIENTS AND METHODS The European COPD Audit was carried out in 384 hospitals from 13 European countries between 2010 and 2011 to analyze guideline adherence in eCOPD. In this analysis, admissions were split according to the presence or absence of consolidation on the admission chest radiograph. Groups were compared in terms of clinical and epidemiological features, existing treatment, clinical care utilized, and mortality. RESULTS 14,111 cases were included, comprising 2,714 (19.2%) with consolidation and 11,397 (80.8%) without. The risk of radiographic consolidation increased with age, female gender, cardiovascular diseases, two or more admissions in the previous year, and sputum color change. Previous treatment with inhaled steroids was not associated with consolidation. Patients with radiographic consolidation were significantly more likely to receive antibiotics, oxygen, and non-invasive ventilation during the admission, and had lower survival from admission to 90-day follow-up. CONCLUSIONS Patients admitted for COPD exacerbation who have radiological consolidation have a more severe illness course, are treated more intensively by clinicians, and have a poorer prognosis. We recommend that these patients be considered a distinct subset in COPD exacerbation.
Abstract:
Context. We interpret multicolor data from OSIRIS NAC for the remote-sensing exploration of comet 67P/Churyumov-Gerasimenko. Aims. We determine the most meaningful definition of color maps for characterizing surface variegation with the filters available on OSIRIS NAC. Methods. We analyzed laboratory spectra of selected minerals and olivine-pyroxene mixtures as seen through the OSIRIS NAC filters, using spectral methods from the literature: reflectance ratios, minimum band wavelength, spectral slopes, band tilt, band curvature, and visible tilt. Results. We emphasize the importance of reflectance ratios, and particularly the relation of visible tilt vs. band tilt, which provides a reliable diagnostic of the presence of silicates. Color maps constructed from red-green-blue composites of the green, orange, red, IR, and Fe2O3 filters allow us to delineate regions that may differ significantly in composition.
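As a rough illustration of the indicators named above, the sketch below computes a reflectance ratio, the visible tilt, and the band tilt for a single pixel. The filter centre wavelengths and the sample reflectance values are assumptions for illustration, not values from the paper.

```python
# Approximate filter centre wavelengths in nm (indicative values assumed
# here; consult the instrument documentation for the exact band centres).
FILTERS = {"green": 536.0, "orange": 649.0, "red": 701.0,
           "IR": 882.0, "Fe2O3": 932.0}

def slope(r_a, r_b, wl_a, wl_b):
    """Normalized spectral slope between two filters, in %/100 nm."""
    return (r_b - r_a) / r_a / (wl_b - wl_a) * 1.0e4

# Reflectances of one made-up pixel seen through five filters.
r = {"green": 0.040, "orange": 0.047, "red": 0.051,
     "IR": 0.058, "Fe2O3": 0.057}

ratio = r["red"] / r["green"]                            # reflectance ratio
visible_tilt = slope(r["green"], r["red"],
                     FILTERS["green"], FILTERS["red"])   # visible tilt
band_tilt = slope(r["IR"], r["Fe2O3"],
                  FILTERS["IR"], FILTERS["Fe2O3"])       # band tilt

# Plotting visible tilt against band tilt pixel by pixel is the kind of
# relation the abstract highlights as diagnostic of silicates.
print(f"red/green = {ratio:.3f}, visible tilt = {visible_tilt:.2f}, "
      f"band tilt = {band_tilt:.2f} (%/100 nm)")
```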
Abstract:
A wide variety of spatial data collection efforts are ongoing throughout local, state and federal agencies, private firms and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives such as the National Spatial Data Infrastructure, the National Map and Geospatial One-Stop to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data and those using GIS to meet specific needs to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets that are of common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through Data Clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open access licenses, archival services and documentation of parent lineage of the contributors and value-adders of digital spatial data sets.
Abstract:
Lovell and Rouse (LR) have recently proposed a modification of the standard DEA model that overcomes the infeasibility problem often encountered in computing super-efficiency. In the LR procedure one appropriately scales up the observed input vector (or scales down the output vector) of the relevant super-efficient firm, thereby usually creating its inefficient surrogate. An alternative procedure proposed in this paper uses the directional distance function introduced by Chambers, Chung, and Färe and the resulting Nerlove-Luenberger (NL) measure of super-efficiency. Because the directional distance function combines features of both an input-oriented and an output-oriented model, it generally leads to a more complete ranking of the observations than either of the oriented models. An added advantage of this approach is that the NL super-efficiency measure is unique and does not depend on any arbitrary choice of a scaling parameter. A data set on international airlines from Coelli, Perelman, and Grifell-Tatjé (2002) is utilized in an illustrative empirical application.
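A minimal sketch of how such an NL super-efficiency score can be computed, assuming constant returns to scale and the direction g = (x_k, y_k); the toy data and the scipy-based formulation are illustrative, not the paper's own implementation.

```python
import numpy as np
from scipy.optimize import linprog

def nl_super_efficiency(X, Y, k):
    """Nerlove-Luenberger super-efficiency of unit k (CRS, g = (x_k, y_k)).

    Solves: max beta  s.t.  sum_{j!=k} lam_j x_j <= (1 - beta) x_k,
                            sum_{j!=k} lam_j y_j >= (1 + beta) y_k, lam >= 0.
    beta < 0 flags a super-efficient unit; unlike the radial model, this
    program is always feasible (beta = -1, lam = 0 satisfies it).
    """
    n = X.shape[1]
    others = [j for j in range(n) if j != k]
    # Decision vector: [beta, lam_j for j != k]; linprog minimizes, so -beta.
    c = np.r_[-1.0, np.zeros(len(others))]
    A_ub = np.vstack([
        np.c_[X[:, [k]], X[:, others]],    # lam'X + beta*x_k <= x_k
        np.c_[Y[:, [k]], -Y[:, others]],   # beta*y_k - lam'Y <= -y_k
    ])
    b_ub = np.r_[X[:, k], -Y[:, k]]
    bounds = [(None, None)] + [(0.0, None)] * len(others)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return -res.fun if res.success else None

# Toy data: rows are inputs/outputs, columns are firms.
X = np.array([[2.0, 3.0, 4.0, 5.0, 6.0],
              [4.0, 2.0, 3.0, 6.0, 2.0]])
Y = np.ones((1, 5))
for k in range(5):
    print(f"firm {k}: beta* = {nl_super_efficiency(X, Y, k):+.3f}")
```

Because both inputs and outputs are scaled by the same beta, the measure needs no orientation choice and no arbitrary scaling parameter, which is the point the abstract makes.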
Abstract:
We are confident of many of the judgements we make as to what sorts of alterations the members of nature’s kinds can survive, and what sorts of events mark the ends of their existences. But is our confidence based on empirical observation of nature’s kinds and their members? Conventionalists deny that we can learn empirically which properties are essential to the members of nature’s kinds. Judgements of sameness in kind between members, and of numerical sameness of a member across time, merely project our conventions of individuation. Our confidence is warranted because apart from those conventions there are no phenomena of kind-sameness or of numerical sameness across time. There is just “stuff” displaying properties. This paper argues that conventionalists can assign no properties to the “stuff” beyond immediate phenomenal properties. Consequently they cannot explain how each of us comes to be able to wield “our conventions”.
Abstract:
This research examines the site and situation characteristics of community trails as landscapes promoting physical activity. Trail segment and neighborhood characteristics for six trails in urban, suburban, and exurban towns in northeastern Massachusetts were assessed from primary Global Positioning System (GPS) data and from secondary Census and land use data integrated in a geographic information system (GIS). Correlations between neighborhood street and housing density, land use mix, and sociodemographic characteristics and trail segment characteristics and amenities measure the degree to which trail segment attributes are associated with the surrounding neighborhood characteristics.
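A minimal sketch of the kind of association matrix such a study reports, assuming a per-segment table that joins GPS-derived trail attributes with Census and land-use measures; the file and column names are placeholders, not the study's actual variables.

```python
import pandas as pd

# Hypothetical per-segment table (placeholder file and column names).
segments = pd.read_csv("trail_segments.csv")
neighborhood = ["street_density", "housing_density", "land_use_mix",
                "median_income"]
trail = ["segment_length_m", "surface_paved", "amenity_count"]

# Rank correlations between neighborhood measures and trail attributes.
print(segments[neighborhood + trail].corr(method="spearman")
      .loc[neighborhood, trail].round(2))
```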
Abstract:
Digital terrain models (DTM) typically contain large numbers of postings, from hundreds of thousands to billions. Many algorithms that run on DTMs require topological knowledge of the postings, such as finding nearest neighbors, finding the posting closest to a chosen location, etc. If the postings are arranged irregularly, topological information is costly to compute and to store. This paper offers a practical approach to organizing and searching irregularly-spaced data sets by presenting a collection of efficient algorithms (O(N), O(lg N)) that compute important topological relationships with only a simple supporting data structure. These relationships include finding the postings within a window, locating the posting nearest a point of interest, finding the neighborhood of postings nearest a point of interest, and ordering the neighborhood counter-clockwise. These algorithms depend only on two sorted arrays of two-element tuples, each holding a planimetric coordinate and an integer identification number indicating which posting the coordinate belongs to. There is one array for each planimetric coordinate (eastings and northings). These two arrays cost minimal overhead to create and store but permit the data to remain arranged irregularly.
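A small sketch of this idea: two sorted (coordinate, id) arrays answer window queries with binary searches, and a nearest-posting query by growing a window. The class and method names are mine, and this is a simplified reading of the described structure, not the paper's code.

```python
import bisect
import math

class PostingIndex:
    """Two sorted (coordinate, id) arrays over irregular postings, one per
    planimetric coordinate, in the spirit of the paper's supporting data
    structure. Assumes at least one posting."""

    def __init__(self, points):
        self.points = points                      # list of (easting, northing)
        self.by_e = sorted((e, i) for i, (e, n) in enumerate(points))
        self.by_n = sorted((n, i) for i, (e, n) in enumerate(points))

    def _ids_between(self, arr, lo, hi):
        """Ids whose coordinate lies in [lo, hi]; two O(lg N) searches."""
        i = bisect.bisect_left(arr, (lo, -1))
        j = bisect.bisect_right(arr, (hi, len(self.points)))
        return {pid for _, pid in arr[i:j]}

    def window(self, e_min, e_max, n_min, n_max):
        """Postings inside an axis-aligned window: intersect the candidate
        sets from the eastings array and the northings array."""
        return (self._ids_between(self.by_e, e_min, e_max) &
                self._ids_between(self.by_n, n_min, n_max))

    def nearest(self, e, n, r0=1.0):
        """Posting nearest (e, n): grow a square window until it holds a
        candidate, then re-query at the best candidate's true distance,
        since a closer posting could lie just outside the square."""
        r = r0
        while True:
            cand = self.window(e - r, e + r, n - r, n + r)
            if cand:
                d = min(math.dist(self.points[i], (e, n)) for i in cand)
                sure = self.window(e - d, e + d, n - d, n + d)
                return min(sure,
                           key=lambda i: math.dist(self.points[i], (e, n)))
            r *= 2.0

idx = PostingIndex([(1.0, 2.0), (3.5, 0.5), (2.2, 4.1)])
print(idx.window(0.0, 3.0, 0.0, 3.0))   # -> {0}
print(idx.nearest(3.0, 1.0))            # -> 1
```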
Abstract:
Many datasets used by economists and other social scientists are collected by stratified sampling. The sampling scheme used to collect the data induces a probability distribution on the observed sample that differs from the target or underlying distribution for which inference is to be made. If this effect is not taken into account, subsequent statistical inference can be seriously biased. This paper shows how to do efficient semiparametric inference in moment restriction models when data from the target population is collected by three widely used sampling schemes: variable probability sampling, multinomial sampling, and standard stratified sampling.
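The paper's semiparametric efficiency analysis goes well beyond this, but a simulated sketch shows why the induced sample distribution matters for even the simplest moment restriction, E[y - mu] = 0, under variable probability sampling. The population, cutoff, and retention probabilities below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A made-up target population with two strata defined by the outcome.
N = 200_000
y = rng.normal(10.0, 3.0, N)
p = np.where(y > 12.0, 0.50, 0.05)   # variable probability sampling:
keep = rng.random(N) < p             # the upper stratum is oversampled

y_s, p_s = y[keep], p[keep]

naive = y_s.mean()                              # ignores the design: biased
ipw = np.sum(y_s / p_s) / np.sum(1.0 / p_s)     # solves sum w_i (y_i - mu) = 0

print(f"population {y.mean():.3f}  naive {naive:.3f}  weighted {ipw:.3f}")
```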
Abstract:
Despite the extensive work on currency mismatches, research on the determinants and effects of maturity mismatches is scarce. In this paper I show that emerging market maturity mismatches are negatively affected by capital inflows and price volatilities. Furthermore, I find that banks with low maturity mismatches are more profitable during crisis periods but less profitable otherwise. The latter result implies that banks face a tradeoff between higher returns and risk; hence, the channeling of short-term capital into long-term loans is driven by cronyism and implicit guarantees rather than by the depth of the financial market. The positive relationship between maturity mismatches and price volatility, on the other hand, shows that the banks of countries with high exchange-rate and interest-rate volatilities cannot, or choose not to, hedge themselves. These results follow from a panel regression on a data set I constructed by merging bank-level data with aggregate data, which is advantageous over traditional studies that focus only on aggregate data.
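A sketch of the merge-then-regress design, assuming hypothetical file and column names (bank-level mismatches joined to country-year aggregates); the specification below is a generic fixed-effects regression, not the paper's exact model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical inputs, for illustration only.
banks = pd.read_csv("banks.csv")    # bank, country, year, mismatch, size
macro = pd.read_csv("macro.csv")    # country, year, inflows, fx_vol, ir_vol

# Attach the aggregate series to each bank-year observation.
panel = banks.merge(macro, on=["country", "year"], how="left")

# Bank fixed effects via dummies; standard errors clustered by bank.
fit = smf.ols("mismatch ~ inflows + fx_vol + ir_vol + size + C(bank)",
              data=panel).fit(cov_type="cluster",
                              cov_kwds={"groups": panel["bank"]})
print(fit.summary().tables[1])
```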
Abstract:
A problem frequently encountered in Data Envelopment Analysis (DEA) is that the total number of inputs and outputs included tends to be too large relative to the sample size. One way to counter this problem is to combine several inputs (or outputs) into (meaningful) aggregate variables, thereby reducing the dimension of the input (or output) vector. A direct effect of input aggregation is to reduce the number of constraints. This, in turn, alters the optimal value of the objective function. In this paper, we show how a statistical test proposed by Banker (1993) may be applied to test the validity of a specific way of aggregating several inputs. An empirical application using data from Indian manufacturing for the year 2002-03 is included as an example of the proposed test.
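A sketch of one common form of such a test, assuming efficiency scores have already been computed under both specifications; the exponential-inefficiency variant shown here is one of the distributional assumptions Banker-style tests admit, and the scores below are made up.

```python
import numpy as np
from scipy.stats import f

def banker_aggregation_test(theta_agg, theta_full):
    """Banker (1993)-style test of an input-aggregation restriction.

    theta_agg, theta_full: DEA efficiency scores in (0, 1] for the same n
    firms under the aggregated and disaggregated specifications.
    Inefficiency is taken as -ln(theta); if inefficiencies are i.i.d.
    exponential under the null, the ratio of their sums is referred to
    F(2n, 2n). Aggregation restricts the multiplier problem, so the ratio
    should exceed 1; large values count against the aggregation.
    """
    u_agg = -np.log(np.asarray(theta_agg))
    u_full = -np.log(np.asarray(theta_full))
    n = len(u_agg)
    stat = u_agg.sum() / u_full.sum()
    return stat, f.sf(stat, 2 * n, 2 * n)

# Made-up scores for 50 firms, for illustration only.
rng = np.random.default_rng(1)
theta_full = rng.uniform(0.6, 1.0, 50)
theta_agg = np.clip(theta_full - rng.uniform(0.0, 0.1, 50), 0.05, 1.0)
print(banker_aggregation_test(theta_agg, theta_full))
```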
Abstract:
This paper extends the existing research on real estate investment trust (REIT) operating efficiencies. We estimate a stochastic-frontier panel-data model specifying a translog cost function, covering 1995 to 2003. The results disagree with previous research in that we find little evidence of scale economies and some evidence of scale diseconomies. Moreover, we also generally find smaller inefficiencies than those shown by other REIT studies. Contrary to previous research, the results also show that self-management of a REIT associates with more inefficiency when we measure output with assets. When we use revenue to measure output, self-management associates with less inefficiency. Also contrary to previous research, higher leverage associates with more efficiency. The results further suggest that inefficiency increases over time in three of our four specifications.
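For reference, a generic single-output translog cost frontier of the kind the abstract describes; the paper's exact regressors and controls may differ from this sketch.

```latex
\ln C_{it} = \alpha_0 + \alpha_y \ln y_{it}
  + \tfrac{1}{2}\gamma_{yy}(\ln y_{it})^2
  + \sum_j \beta_j \ln p_{j,it}
  + \tfrac{1}{2}\sum_j \sum_k \gamma_{jk} \ln p_{j,it}\,\ln p_{k,it}
  + \sum_j \rho_{jy} \ln p_{j,it}\,\ln y_{it}
  + v_{it} + u_{it},
\qquad
\gamma_{jk}=\gamma_{kj},\quad \sum_j \beta_j = 1,\quad
\sum_j \gamma_{jk} = 0,\quad \sum_j \rho_{jy} = 0 .
```

Here $v_{it}$ is two-sided noise, $u_{it}\ge 0$ is inefficiency, and returns to scale are read off the cost elasticity $\partial \ln C/\partial \ln y$, with values below one indicating scale economies.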
Abstract:
This paper evaluates inflation targeting and assesses its merits by comparing alternative targets in a macroeconomic model. We use European aggregate data to evaluate the performance of alternative policy rules under alternative inflation targets in terms of output losses. We employ two major alternative policy rules, forward-looking and spontaneous adjustment, and three alternative inflation targets: zero percent, two percent, and four percent inflation rates. The simulation findings suggest that forward-looking rules contribute to macroeconomic stability and increase monetary policy credibility. The superiority of a positive inflation target, in terms of output losses, emerges for the aggregate data. The same methodology, when applied to individual countries, however, suggests that country-specific flexible inflation targeting can improve employment prospects in Europe.
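As a sketch of the ingredients (the abstract does not spell out the rule coefficients or the loss function, so the forms below are generic assumptions): a forward-looking rule sets the policy rate against expected deviations of inflation from the target $\pi^{*}$, and each rule-target pair is scored by squared output losses.

```latex
i_t = \bar{r} + \pi^{*} + \phi_\pi \left( E_t\,\pi_{t+1} - \pi^{*} \right)
      + \phi_y\, y_t,
\qquad
L = \frac{1}{T} \sum_{t=1}^{T} \left( y_t - y^{*} \right)^2 .
```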
Abstract:
The Indian textiles industry is now at a crossroads with the phasing out of the quota regime that prevailed under the Multi-Fibre Arrangement (MFA) until the end of 2004. In the face of full integration of the textiles sector into the WTO, maintaining and enhancing productive efficiency is a precondition for the competitiveness of Indian firms in the newly liberalized world market. In this paper we use data obtained from the Annual Survey of Industries for a number of years to measure the levels of technical efficiency in the Indian textiles industry at the firm level. We use both a grand frontier applicable to all firms and a group frontier specific to firms from any individual state, ownership, or organization type in order to evaluate their efficiencies. This permits us to separately identify how locational, proprietary, and organizational characteristics of a firm affect its performance.
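A standard way to relate the two frontiers (a metafrontier-style decomposition; the paper may parameterize this differently) is to factor grand-frontier efficiency into group-frontier efficiency and a technology gap ratio:

```latex
TE_i^{\text{grand}} \;=\; TE_i^{\text{group}} \times TGR_i,
\qquad 0 < TGR_i \le 1 .
```

A firm can thus be fully efficient relative to its own state, ownership, or organization group yet still lag the grand frontier through $TGR_i$, which is what allows locational, proprietary, and organizational effects on performance to be identified separately.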